Apr 16 18:14:35.935693 ip-10-0-138-175 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:14:35.935705 ip-10-0-138-175 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:14:35.935712 ip-10-0-138-175 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:14:35.935952 ip-10-0-138-175 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:14:46.137183 ip-10-0-138-175 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:14:46.137202 ip-10-0-138-175 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d6a4c49decb7411c81b53ee421b9a1cd --
Apr 16 18:17:06.862255 ip-10-0-138-175 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:17:07.305003 ip-10-0-138-175 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:07.305752 ip-10-0-138-175 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:17:07.305752 ip-10-0-138-175 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:07.305752 ip-10-0-138-175 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:17:07.305752 ip-10-0-138-175 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:07.307341 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.307237 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:17:07.310288 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310274 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:07.310288 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310287 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310291 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310295 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310298 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310301 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310304 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310307 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310309 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310312 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310315 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310318 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310320 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310323 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310326 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310329 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310332 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310336 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310340 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310348 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:07.310344 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310352 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310355 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310358 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310360 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310363 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310365 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310368 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310370 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310373 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310375 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310378 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310380 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310383 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310385 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310389 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310391 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310394 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310396 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310399 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310401 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:07.310784 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310404 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310406 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310408 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310411 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310413 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310416 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310418 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310421 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310423 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310426 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310428 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310431 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310433 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310436 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310439 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310442 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310445 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310447 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310450 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310452 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:07.311296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310455 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310457 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310459 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310462 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310464 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310467 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310470 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310472 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310474 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310477 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310479 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310482 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310485 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310488 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310490 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310493 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310496 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310499 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310502 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310506 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:07.311766 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310508 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310511 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310514 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310516 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310519 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310521 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310865 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310870 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310872 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310875 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310878 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310880 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310883 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310886 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310888 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310891 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310893 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310896 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310900 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:07.312259 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310904 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310906 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310909 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310912 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310914 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310917 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310920 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310923 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310925 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310927 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310930 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310933 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310935 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310938 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310940 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310943 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310945 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310948 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310950 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310953 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:07.312704 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310956 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310959 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310961 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310964 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310966 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310968 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310971 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310974 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310976 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310978 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310981 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310983 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.310999 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311002 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311005 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311007 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311010 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311013 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311019 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311022 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:07.313229 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311024 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311027 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311030 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311032 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311035 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311038 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311040 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311043 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311047 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311050 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311053 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311057 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311060 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311062 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311066 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311068 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311071 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311074 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311076 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:07.313702 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311079 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311082 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311084 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311087 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311089 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311092 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311094 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311097 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311099 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311102 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311104 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311106 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311115 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.311117 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312240 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312248 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312255 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312260 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312264 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312267 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312272 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:17:07.314160 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312276 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312280 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312282 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312286 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312289 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312293 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312296 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312299 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312302 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312305 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312308 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312311 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312315 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312317 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312321 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312323 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312326 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312330 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312333 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312336 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312339 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312342 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312344 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312347 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312354 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:17:07.314645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312357 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312361 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312365 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312368 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312370 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312374 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312376 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312381 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312384 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312387 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312390 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312393 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416
18:17:07.312397 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312399 2572 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312402 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312406 2572 flags.go:64] FLAG: --eviction-soft="" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312409 2572 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312412 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312415 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312417 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312420 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312423 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312426 2572 flags.go:64] FLAG: --feature-gates="" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312429 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312432 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:17:07.315241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312435 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312439 2572 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312442 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312445 2572 flags.go:64] FLAG: --help="false" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312448 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-138-175.ec2.internal" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312451 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312454 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312457 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312461 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312465 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312467 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312470 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312473 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312476 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312478 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:17:07.315813 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312481 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312484 2572 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312487 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312490 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312493 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312495 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312498 2572 flags.go:64] FLAG: --lock-file="" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312501 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312504 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:17:07.315813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312506 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312511 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312514 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312517 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312520 2572 flags.go:64] FLAG: --logging-format="text" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312522 2572 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312525 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312528 2572 flags.go:64] FLAG: --manifest-url="" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312531 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312536 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312539 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312542 2572 flags.go:64] FLAG: --max-pods="110" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312545 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312548 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312551 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312554 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312557 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312560 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312563 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312571 2572 flags.go:64] FLAG: 
--node-status-max-images="50" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312574 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312577 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312580 2572 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:17:07.316396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312583 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312588 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312591 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312594 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312597 2572 flags.go:64] FLAG: --port="10250" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312600 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312602 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05ebbf337afba188e" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312605 2572 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312608 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312611 2572 flags.go:64] FLAG: --register-node="true" Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312614 2572 flags.go:64] FLAG: --register-schedulable="true" 
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312617 2572 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312620 2572 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312623 2572 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312626 2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312629 2572 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312632 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312635 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312638 2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312641 2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312644 2572 flags.go:64] FLAG: --runonce="false"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312647 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312649 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312653 2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312656 2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312659 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:17:07.316921 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312662 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312666 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312669 2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312671 2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312674 2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312677 2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312680 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312683 2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312686 2572 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312689 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312694 2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312697 2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312700 2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312704 2572 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312707 2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312710 2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312713 2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312716 2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312719 2572 flags.go:64] FLAG: --v="2"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312723 2572 flags.go:64] FLAG: --version="false"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312726 2572 flags.go:64] FLAG: --vmodule=""
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312730 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.312733 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312822 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:07.317561 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312826 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312830 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312832 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312835 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312838 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312840 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312843 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312846 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312849 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312852 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312855 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312857 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312860 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312862 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312865 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312868 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312870 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312873 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312875 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312878 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:07.318183 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312880 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312885 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312888 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312891 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312894 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312897 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312900 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312902 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312905 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312908 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312911 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312913 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312916 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312918 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312921 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312924 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312926 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312929 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312931 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312934 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:07.318668 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312936 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312941 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312944 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312947 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312950 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312952 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312955 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312957 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312960 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312963 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312965 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312968 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312970 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312973 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312975 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312978 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312980 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312982 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312996 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.312999 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:07.319154 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313002 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313004 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313007 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313009 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313013 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313016 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313018 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313021 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313023 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313026 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313028 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313031 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313034 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313037 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313040 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313043 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313046 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313048 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313051 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:07.319632 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313053 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313057 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313061 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313064 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313066 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.313069 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.313077 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.318890 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.318904 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318948 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318953 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318957 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318960 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318964 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318967 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318970 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:07.320108 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318973 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318975 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318978 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318981 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.318984 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319000 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319004 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319007 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319010 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319013 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319015 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319018 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319022 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319025 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319027 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319030 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319032 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319035 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319038 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319040 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:07.320492 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319043 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319046 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319049 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319051 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319055 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319058 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319060 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319063 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319065 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319068 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319071 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319073 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:07.320961
ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319076 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319079 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319081 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319084 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319087 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319089 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319092 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319095 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:17:07.320961 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319097 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319100 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319102 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319105 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319108 2572 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319111 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319114 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319116 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319119 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319123 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319127 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319130 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319133 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319135 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319138 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319141 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319144 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 
18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319147 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319150 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:17:07.321475 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319153 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319155 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319158 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319161 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319163 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319168 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319171 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319174 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319177 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319179 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319182 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319185 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319187 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319190 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319193 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319195 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319198 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319200 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319203 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:07.321921 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319206 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.319212 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319301 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319305 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319308 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319311 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319314 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319317 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319320 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319323 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319326 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319329 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319332 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319335 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319337 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319340 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:07.322384 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319343 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319346 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319348 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319350 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319353 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319356 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319358 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319361 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319363 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319367 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319370 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319373 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319376 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319379 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319381 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319384 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319387 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319390 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319393 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:07.322847 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319395 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319398 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319401 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319403 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319406 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319408 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319411 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319413 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319416 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319418 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319421 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319423 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319426 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319428 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319431 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319433 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319436 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319439 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319441 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319444 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:07.323302 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319446 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319449 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319451 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319454 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319456 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319459 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319461 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319464 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319467 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319470 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319472 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319475 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319478 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319480 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319483 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319486 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319488 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319491 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319493 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319496 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:07.323773 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319498 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319500 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319503 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319505 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319508 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319512 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319515 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319518 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319520 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319523 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319526 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319529 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:07.319531 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.319536 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:07.324296 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.320344 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:17:07.324661 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.322629 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:17:07.324661 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.323379 2572 server.go:1019] "Starting client certificate rotation"
Apr 16 18:17:07.324661 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.323468 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:07.324661 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.324058 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:07.346880 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.346865 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:07.350177 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.350160 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:07.362685 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.362668 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:17:07.367779 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.367759 2572 log.go:25] "Validated CRI v1 image API"
Apr 16 18:17:07.370744 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.370730 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:17:07.372781 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.372758 2572 fs.go:135] Filesystem UUIDs: map[13eee127-8ea1-4115-8e4a-bd12ed53a3df:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f97f232f-7e08-440a-984a-4254fcf132e0:/dev/nvme0n1p4]
Apr 16 18:17:07.372854 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.372779 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:17:07.378068 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.377943 2572 manager.go:217] Machine: {Timestamp:2026-04-16 18:17:07.376339526 +0000 UTC m=+0.401382314 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199896 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec231eae5dcf34b211c7aa01a5bfab29 SystemUUID:ec231eae-5dcf-34b2-11c7-aa01a5bfab29 BootID:d6a4c49d-ecb7-411c-81b5-3ee421b9a1cd Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:82:25:6b:ee:03 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:82:25:6b:ee:03 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:ea:8e:62:e8:e6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:17:07.378068 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.378055 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:17:07.378219 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.378142 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:17:07.380586 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.380560 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:17:07.380756 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.380589 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-175.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:17:07.380839 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.380768 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:17:07.380839 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.380780 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:17:07.380839 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.380798 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:17:07.381456 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.381444 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:17:07.382952 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.382939 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:17:07.383096 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.383085 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:17:07.386339 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.386327 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:17:07.386399 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.386345 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:17:07.386399 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.386364 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:17:07.386399 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.386377 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:17:07.386399 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.386388 2572 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 18:17:07.387241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.387228 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:17:07.387313 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.387250 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:17:07.388127 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.388108 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:17:07.390070 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.390049 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:17:07.392164 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.392151 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:17:07.393256 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393244 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:17:07.393299 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393260 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:17:07.393299 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393266 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:17:07.393299 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393276 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:17:07.393299 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393282 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:17:07.393299 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393288 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:17:07.393299 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393294 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:17:07.393299 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393299 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:17:07.393469 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393306 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:17:07.393469 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393321 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:17:07.393469 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393342 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:17:07.393469 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.393395 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:17:07.394423 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.394411 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:17:07.394453 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.394425 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:17:07.398530 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.398502 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-175.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:17:07.398590 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.398541 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is 
forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:17:07.398734 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.398718 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-175.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:17:07.399619 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.399576 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:17:07.399856 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.399844 2572 server.go:1295] "Started kubelet" Apr 16 18:17:07.400036 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.400006 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:17:07.400096 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.400024 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:17:07.400096 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.400080 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:17:07.400543 ip-10-0-138-175 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:17:07.401126 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.401100 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:17:07.402448 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.402436 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:17:07.406049 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.406029 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:17:07.406145 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.406060 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:17:07.406766 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.406751 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:17:07.406844 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.406754 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:17:07.406844 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.406784 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:17:07.406844 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.406841 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:17:07.407000 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.406849 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:17:07.407000 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.406870 2572 factory.go:55] Registering systemd factory
Apr 16 18:17:07.407000 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.406922 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:17:07.407000 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.406965 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:07.407156 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.407144 2572 factory.go:153] Registering CRI-O factory
Apr 16 18:17:07.407156 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.407155 2572 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:17:07.407223 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.407209 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:17:07.407267 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.407232 2572 factory.go:103] Registering Raw factory
Apr 16 18:17:07.407267 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.407245 2572 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:17:07.407742 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.407726 2572 manager.go:319] Starting recovery of all containers
Apr 16 18:17:07.407870 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.407844 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:17:07.413573 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.413546 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 18:17:07.413859 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.413833 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-175.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 18:17:07.414509 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.414485 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:17:07.414937 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.413588 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-175.ec2.internal.18a6e9202a264ed9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-175.ec2.internal,UID:ip-10-0-138-175.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-175.ec2.internal,},FirstTimestamp:2026-04-16 18:17:07.399618265 +0000 UTC m=+0.424661058,LastTimestamp:2026-04-16 18:17:07.399618265 +0000 UTC m=+0.424661058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-175.ec2.internal,}"
Apr 16 18:17:07.415470 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.415450 2572 manager.go:324] Recovery completed
Apr 16 18:17:07.422052 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.421947 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:07.424534 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.424518 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:07.424594 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.424545 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:07.424594 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.424555 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:07.425087 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.425069 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:17:07.425087 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.425084 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:17:07.425183 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.425102 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:17:07.426753 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.426699 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-175.ec2.internal.18a6e9202ba27c96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-175.ec2.internal,UID:ip-10-0-138-175.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-175.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-175.ec2.internal,},FirstTimestamp:2026-04-16 18:17:07.424533654 +0000 UTC m=+0.449576445,LastTimestamp:2026-04-16 18:17:07.424533654 +0000 UTC m=+0.449576445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-175.ec2.internal,}"
Apr 16 18:17:07.427184 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.427171 2572 policy_none.go:49] "None policy: Start"
Apr 16 18:17:07.427230 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.427191 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:17:07.427230 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.427205 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:17:07.434727 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.434670 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-175.ec2.internal.18a6e9202ba2bd72 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-175.ec2.internal,UID:ip-10-0-138-175.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-138-175.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-138-175.ec2.internal,},FirstTimestamp:2026-04-16 18:17:07.424550258 +0000 UTC m=+0.449593050,LastTimestamp:2026-04-16 18:17:07.424550258 +0000 UTC m=+0.449593050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-175.ec2.internal,}"
Apr 16 18:17:07.444887 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.444828 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-175.ec2.internal.18a6e9202ba2de58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-175.ec2.internal,UID:ip-10-0-138-175.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-138-175.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-138-175.ec2.internal,},FirstTimestamp:2026-04-16 18:17:07.42455868 +0000 UTC m=+0.449601472,LastTimestamp:2026-04-16 18:17:07.42455868 +0000 UTC m=+0.449601472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-175.ec2.internal,}"
Apr 16 18:17:07.453719 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.453701 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-465pc"
Apr 16 18:17:07.462156 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.462140 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-465pc"
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.467066 2572 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.467093 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.467102 2572 server.go:85] "Starting device plugin registration server"
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.467297 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.467308 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.467378 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.467459 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.467468 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.467976 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:17:07.478189 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.468023 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:07.568226 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.568183 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:07.569092 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.569077 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:07.569161 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.569105 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:07.569161 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.569115 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:07.569161 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.569140 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.577661 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.577639 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:17:07.577661 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.577665 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:17:07.577770 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.577680 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:17:07.577770 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.577687 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:17:07.577770 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.577714 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:17:07.578093 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.578076 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.578186 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.578096 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-175.ec2.internal\": node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:07.581865 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.581850 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:07.625347 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.625327 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:07.678192 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.678169 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal"]
Apr 16 18:17:07.678288 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.678242 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:07.680039 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.680024 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:07.680105 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.680051 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:07.680105 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.680061 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:07.681325 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.681311 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:07.681475 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.681458 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.681525 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.681488 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:07.683323 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.683309 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:07.683412 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.683333 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:07.683412 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.683343 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:07.683412 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.683313 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:07.683412 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.683370 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:07.683412 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.683387 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:07.685084 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.685068 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.685143 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.685098 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:07.685854 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.685836 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:07.685919 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.685872 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:07.685919 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.685888 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:07.708205 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.708180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7f865783273d649a9fd06761511c430-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal\" (UID: \"a7f865783273d649a9fd06761511c430\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.708285 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.708209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f1b0f110a831d4055d6fd2016cd3ab02-config\") pod \"kube-apiserver-proxy-ip-10-0-138-175.ec2.internal\" (UID: \"f1b0f110a831d4055d6fd2016cd3ab02\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.708285 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.708226 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7f865783273d649a9fd06761511c430-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal\" (UID: \"a7f865783273d649a9fd06761511c430\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.713588 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.713574 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-175.ec2.internal\" not found" node="ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.717923 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.717904 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-175.ec2.internal\" not found" node="ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.725801 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.725783 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:07.809082 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.809054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f1b0f110a831d4055d6fd2016cd3ab02-config\") pod \"kube-apiserver-proxy-ip-10-0-138-175.ec2.internal\" (UID: \"f1b0f110a831d4055d6fd2016cd3ab02\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.809167 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.809082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7f865783273d649a9fd06761511c430-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal\" (UID: \"a7f865783273d649a9fd06761511c430\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.809167 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.809102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7f865783273d649a9fd06761511c430-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal\" (UID: \"a7f865783273d649a9fd06761511c430\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.809167 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.809137 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7f865783273d649a9fd06761511c430-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal\" (UID: \"a7f865783273d649a9fd06761511c430\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.809260 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.809160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f1b0f110a831d4055d6fd2016cd3ab02-config\") pod \"kube-apiserver-proxy-ip-10-0-138-175.ec2.internal\" (UID: \"f1b0f110a831d4055d6fd2016cd3ab02\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.809260 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:07.809181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7f865783273d649a9fd06761511c430-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal\" (UID: \"a7f865783273d649a9fd06761511c430\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:07.826201 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.826152 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:07.926998 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:07.926958 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:08.017434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.017414 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:08.019957 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.019937 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal"
Apr 16 18:17:08.027537 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:08.027515 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:08.128111 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:08.128056 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:08.228525 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:08.228502 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:08.277135 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.277111 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:08.323788 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.323766 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:17:08.324193 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.323881 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:08.324193 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.323892 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:08.328903 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:08.328883 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:08.406796 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.406755 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:17:08.418186 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.418167 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:08.429742 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:08.429724 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found"
Apr 16 18:17:08.448698 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.448678 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-svxbd"
Apr 16 18:17:08.457720 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.457701 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-svxbd"
Apr 16 18:17:08.463784 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.463759 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:12:07 +0000 UTC" deadline="2027-10-12 04:06:33.464679165 +0000 UTC"
Apr 16 18:17:08.463784 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.463782 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13041h49m25.000900087s"
Apr 16 18:17:08.505802 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:08.505775 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1b0f110a831d4055d6fd2016cd3ab02.slice/crio-c8b4b07aca6f4c18b5e7ecf97624af7142b11e2cd81c0c4261d199fc10fedd09 WatchSource:0}: Error finding container c8b4b07aca6f4c18b5e7ecf97624af7142b11e2cd81c0c4261d199fc10fedd09: Status 404 returned error can't find the container with id c8b4b07aca6f4c18b5e7ecf97624af7142b11e2cd81c0c4261d199fc10fedd09
Apr 16 18:17:08.506238 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:08.506217 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f865783273d649a9fd06761511c430.slice/crio-c88b6c55533b16d8c1ed0f29ea5149f61f5801af5316ebfb5eeea99089096b3a WatchSource:0}: Error finding container c88b6c55533b16d8c1ed0f29ea5149f61f5801af5316ebfb5eeea99089096b3a: Status 404 returned error can't find the container with id c88b6c55533b16d8c1ed0f29ea5149f61f5801af5316ebfb5eeea99089096b3a
Apr 16 18:17:08.509697 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.509676 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16
18:17:08.530522 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:08.530501 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-175.ec2.internal\" not found" Apr 16 18:17:08.580366 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.580328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal" event={"ID":"f1b0f110a831d4055d6fd2016cd3ab02","Type":"ContainerStarted","Data":"c8b4b07aca6f4c18b5e7ecf97624af7142b11e2cd81c0c4261d199fc10fedd09"} Apr 16 18:17:08.581272 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.581254 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal" event={"ID":"a7f865783273d649a9fd06761511c430","Type":"ContainerStarted","Data":"c88b6c55533b16d8c1ed0f29ea5149f61f5801af5316ebfb5eeea99089096b3a"} Apr 16 18:17:08.610561 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.610539 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:08.707084 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.707029 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal" Apr 16 18:17:08.718721 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.718700 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:17:08.719479 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.719466 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal" Apr 16 18:17:08.729312 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.729293 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, 
which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:17:08.915045 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:08.914951 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:09.388037 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.388009 2572 apiserver.go:52] "Watching apiserver" Apr 16 18:17:09.394001 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.393959 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:17:09.394363 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.394331 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal","openshift-cluster-node-tuning-operator/tuned-4794j","openshift-image-registry/node-ca-v22jh","openshift-multus/multus-additional-cni-plugins-64mfc","openshift-multus/multus-xjsh6","openshift-network-diagnostics/network-check-target-rbqgb","openshift-network-operator/iptables-alerter-lwq9m","kube-system/konnectivity-agent-vwvb6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal","openshift-multus/network-metrics-daemon-kn4cv","openshift-ovn-kubernetes/ovnkube-node-8vpdh"] Apr 16 18:17:09.396806 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.396784 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.397048 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.397028 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v22jh" Apr 16 18:17:09.398034 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.398015 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.399114 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.399065 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-t4g87\"" Apr 16 18:17:09.399204 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.399117 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:17:09.399204 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.399128 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:17:09.399204 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.399138 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:17:09.399363 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.399141 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:17:09.399363 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.399191 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zzn7w\"" Apr 16 18:17:09.399363 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.399353 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:17:09.399602 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.399582 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.400163 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.400142 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:17:09.400384 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.400368 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:17:09.400470 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.400393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:17:09.400724 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.400705 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:17:09.400831 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.400711 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:17:09.400901 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.400848 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-d4rm7\"" Apr 16 18:17:09.400901 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.400854 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:09.401024 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:09.400919 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:09.401766 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.401748 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:17:09.402096 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.402079 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lwq9m" Apr 16 18:17:09.402193 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.402137 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kfgpq\"" Apr 16 18:17:09.403462 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.403444 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vwvb6" Apr 16 18:17:09.404129 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.404112 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:17:09.404311 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.404291 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5cb9n\"" Apr 16 18:17:09.404414 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.404375 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:17:09.404601 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.404583 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:17:09.405182 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.405163 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.405912 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.405897 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:17:09.406437 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.406420 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:09.406519 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:09.406489 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7" Apr 16 18:17:09.406519 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.406499 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:17:09.406633 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.406539 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-nk5z8\"" Apr 16 18:17:09.407162 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.407141 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:17:09.407162 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.407157 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:17:09.407472 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.407441 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fwp8m\"" Apr 16 18:17:09.407472 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.407442 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:17:09.407978 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.407963 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.410315 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.410301 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pr74h\"" Apr 16 18:17:09.411116 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.411098 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:17:09.411116 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.411121 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:17:09.411322 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.411166 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:17:09.411550 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.411535 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:17:09.411636 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.411536 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:17:09.414735 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.413472 2572 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:17:09.415965 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.415945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/711c83e1-762b-4a01-8f25-65c6c4407f6d-cni-binary-copy\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.416056 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.415980 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-sysctl-conf\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.416056 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-lib-modules\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.416056 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-etc-kubernetes\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.416191 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416069 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-node-log\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.416191 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-ovnkube-script-lib\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.416191 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/711c83e1-762b-4a01-8f25-65c6c4407f6d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.416191 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416138 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-var-lib-kubelet\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.416340 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-cni-dir\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.416340 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416227 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/804384dc-65e2-43b1-adfc-3eed621ccd98-iptables-alerter-script\") pod \"iptables-alerter-lwq9m\" (UID: \"804384dc-65e2-43b1-adfc-3eed621ccd98\") " pod="openshift-network-operator/iptables-alerter-lwq9m" Apr 16 18:17:09.416340 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416261 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-slash\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.416340 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416289 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-tuned\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.416340 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416313 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dz2z\" (UniqueName: \"kubernetes.io/projected/e53b9f1d-b078-4852-a6ec-bcffd463c187-kube-api-access-9dz2z\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.416514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-run-openvswitch\") pod \"ovnkube-node-8vpdh\" (UID: 
\"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.416514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-cni-bin\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.416514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416436 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-os-release\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.416514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-kubernetes\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.416514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416481 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-registration-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.416514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-device-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.416743 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416524 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-ovnkube-config\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.416743 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-var-lib-cni-multus\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.416743 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416592 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebe5bfb0-23f3-4c66-9cc1-2436ea624b37-host\") pod \"node-ca-v22jh\" (UID: \"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37\") " pod="openshift-image-registry/node-ca-v22jh" Apr 16 18:17:09.416743 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416621 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ebe5bfb0-23f3-4c66-9cc1-2436ea624b37-serviceca\") pod \"node-ca-v22jh\" (UID: \"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37\") " pod="openshift-image-registry/node-ca-v22jh" Apr 16 18:17:09.416743 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416643 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-daemon-config\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.416743 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-ovn-node-metrics-cert\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.416743 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpr2z\" (UniqueName: \"kubernetes.io/projected/711c83e1-762b-4a01-8f25-65c6c4407f6d-kube-api-access-vpr2z\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.416743 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-modprobe-d\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-run-k8s-cni-cncf-io\") pod \"multus-xjsh6\" (UID: 
\"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-run-multus-certs\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416832 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-sys-fs\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-env-overrides\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlg85\" (UniqueName: \"kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85\") pod \"network-check-target-rbqgb\" (UID: \"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5\") " pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416907 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-sysconfig\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416938 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f1a4c046-40fc-4ba3-83df-94d70edbbba2-tmp\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.416972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-cnibin\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a0f7718-67dc-4701-a6d9-a0f852eb4441-cni-binary-copy\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417086 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417080 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-socket-dir-parent\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417108 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-kubelet\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417131 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-var-lib-openvswitch\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/711c83e1-762b-4a01-8f25-65c6c4407f6d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-run-netns\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-run-ovn\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417214 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-cni-netd\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417266 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zskh\" (UniqueName: \"kubernetes.io/projected/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-kube-api-access-6zskh\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417286 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-systemd\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417306 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-var-lib-kubelet\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417341 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/804384dc-65e2-43b1-adfc-3eed621ccd98-host-slash\") pod \"iptables-alerter-lwq9m\" (UID: \"804384dc-65e2-43b1-adfc-3eed621ccd98\") " pod="openshift-network-operator/iptables-alerter-lwq9m" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417371 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:09.417400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417404 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-sysctl-d\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417420 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-run\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417441 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-host\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417464 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq56c\" (UniqueName: \"kubernetes.io/projected/ebe5bfb0-23f3-4c66-9cc1-2436ea624b37-kube-api-access-xq56c\") pod \"node-ca-v22jh\" (UID: \"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37\") " pod="openshift-image-registry/node-ca-v22jh" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-system-cni-dir\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417506 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-hostroot\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-conf-dir\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 
18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrxz\" (UniqueName: \"kubernetes.io/projected/804384dc-65e2-43b1-adfc-3eed621ccd98-kube-api-access-mlrxz\") pod \"iptables-alerter-lwq9m\" (UID: \"804384dc-65e2-43b1-adfc-3eed621ccd98\") " pod="openshift-network-operator/iptables-alerter-lwq9m" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417571 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-run-ovn-kubernetes\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-system-cni-dir\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417659 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-var-lib-cni-bin\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/612b22f4-6ba0-49fb-8b66-2cff81a247be-konnectivity-ca\") pod 
\"konnectivity-agent-vwvb6\" (UID: \"612b22f4-6ba0-49fb-8b66-2cff81a247be\") " pod="kube-system/konnectivity-agent-vwvb6" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmb9c\" (UniqueName: \"kubernetes.io/projected/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-kube-api-access-tmb9c\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417746 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-sys\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417760 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/612b22f4-6ba0-49fb-8b66-2cff81a247be-agent-certs\") pod \"konnectivity-agent-vwvb6\" (UID: \"612b22f4-6ba0-49fb-8b66-2cff81a247be\") " pod="kube-system/konnectivity-agent-vwvb6" Apr 16 18:17:09.417864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417831 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-systemd-units\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-run-systemd\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-etc-openvswitch\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417905 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx6l8\" (UniqueName: \"kubernetes.io/projected/f1a4c046-40fc-4ba3-83df-94d70edbbba2-kube-api-access-gx6l8\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.418458 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417927 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-os-release\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq8gx\" (UniqueName: \"kubernetes.io/projected/6a0f7718-67dc-4701-a6d9-a0f852eb4441-kube-api-access-bq8gx\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.417971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-socket-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.418024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-cnibin\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.418052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.418068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-run-netns\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.418458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.418083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-log-socket\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.459236 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.459213 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:08 +0000 UTC" deadline="2028-01-19 20:43:06.517325562 +0000 UTC" Apr 16 18:17:09.459236 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.459233 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15434h25m57.058094333s" Apr 16 18:17:09.507652 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.507632 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:17:09.518396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpr2z\" (UniqueName: \"kubernetes.io/projected/711c83e1-762b-4a01-8f25-65c6c4407f6d-kube-api-access-vpr2z\") pod 
\"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.518470 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-modprobe-d\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.518470 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-run-k8s-cni-cncf-io\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.518470 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-run-multus-certs\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.518575 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-sys-fs\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.518575 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518507 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-env-overrides\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.518575 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-run-k8s-cni-cncf-io\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.518575 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518524 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-run-multus-certs\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.518575 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518557 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-modprobe-d\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.518729 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-sys-fs\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.518729 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlg85\" (UniqueName: 
\"kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85\") pod \"network-check-target-rbqgb\" (UID: \"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5\") " pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:09.518729 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518623 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-sysconfig\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.518729 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f1a4c046-40fc-4ba3-83df-94d70edbbba2-tmp\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.518729 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-cnibin\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.518729 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a0f7718-67dc-4701-a6d9-a0f852eb4441-cni-binary-copy\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.518729 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-socket-dir-parent\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.518729 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-kubelet\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.518729 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518724 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-cnibin\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518728 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-var-lib-openvswitch\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/711c83e1-762b-4a01-8f25-65c6c4407f6d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-sysconfig\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518810 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-socket-dir-parent\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518818 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-kubelet\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-var-lib-openvswitch\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518872 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-run-netns\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-run-ovn\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-cni-netd\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518930 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-run-netns\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518939 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-env-overrides\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.518972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-run-ovn\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-cni-netd\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519043 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zskh\" (UniqueName: 
\"kubernetes.io/projected/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-kube-api-access-6zskh\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-systemd\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.519069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-systemd\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-var-lib-kubelet\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/804384dc-65e2-43b1-adfc-3eed621ccd98-host-slash\") pod \"iptables-alerter-lwq9m\" (UID: \"804384dc-65e2-43b1-adfc-3eed621ccd98\") " pod="openshift-network-operator/iptables-alerter-lwq9m" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519204 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-sysctl-d\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-var-lib-kubelet\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519308 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/804384dc-65e2-43b1-adfc-3eed621ccd98-host-slash\") pod \"iptables-alerter-lwq9m\" (UID: \"804384dc-65e2-43b1-adfc-3eed621ccd98\") " pod="openshift-network-operator/iptables-alerter-lwq9m" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:09.519314 2572 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-run\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a0f7718-67dc-4701-a6d9-a0f852eb4441-cni-binary-copy\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-sysctl-d\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-host\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-run\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-host\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:09.519417 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs podName:8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:10.019367244 +0000 UTC m=+3.044410044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs") pod "network-metrics-daemon-kn4cv" (UID: "8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xq56c\" (UniqueName: \"kubernetes.io/projected/ebe5bfb0-23f3-4c66-9cc1-2436ea624b37-kube-api-access-xq56c\") pod \"node-ca-v22jh\" (UID: \"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37\") " pod="openshift-image-registry/node-ca-v22jh" Apr 16 18:17:09.519768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-system-cni-dir\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.519768 ip-10-0-138-175 
kubenswrapper[2572]: I0416 18:17:09.519503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-hostroot\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/711c83e1-762b-4a01-8f25-65c6c4407f6d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-system-cni-dir\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-conf-dir\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-conf-dir\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.520426 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-hostroot\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlrxz\" (UniqueName: \"kubernetes.io/projected/804384dc-65e2-43b1-adfc-3eed621ccd98-kube-api-access-mlrxz\") pod \"iptables-alerter-lwq9m\" (UID: \"804384dc-65e2-43b1-adfc-3eed621ccd98\") " pod="openshift-network-operator/iptables-alerter-lwq9m" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-run-ovn-kubernetes\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-system-cni-dir\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519681 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-var-lib-cni-bin\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.520426 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-run-ovn-kubernetes\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/612b22f4-6ba0-49fb-8b66-2cff81a247be-konnectivity-ca\") pod \"konnectivity-agent-vwvb6\" (UID: \"612b22f4-6ba0-49fb-8b66-2cff81a247be\") " pod="kube-system/konnectivity-agent-vwvb6" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519726 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-system-cni-dir\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmb9c\" (UniqueName: \"kubernetes.io/projected/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-kube-api-access-tmb9c\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " 
pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-var-lib-cni-bin\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-sys\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.520426 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/612b22f4-6ba0-49fb-8b66-2cff81a247be-agent-certs\") pod \"konnectivity-agent-vwvb6\" (UID: \"612b22f4-6ba0-49fb-8b66-2cff81a247be\") " pod="kube-system/konnectivity-agent-vwvb6" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519850 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-sys\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-systemd-units\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-run-systemd\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-etc-openvswitch\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.519976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-systemd-units\") pod \"ovnkube-node-8vpdh\" (UID: 
\"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-run-systemd\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520073 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520155 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-etc-openvswitch\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/612b22f4-6ba0-49fb-8b66-2cff81a247be-konnectivity-ca\") pod \"konnectivity-agent-vwvb6\" (UID: \"612b22f4-6ba0-49fb-8b66-2cff81a247be\") " pod="kube-system/konnectivity-agent-vwvb6" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx6l8\" (UniqueName: \"kubernetes.io/projected/f1a4c046-40fc-4ba3-83df-94d70edbbba2-kube-api-access-gx6l8\") pod 
\"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-os-release\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq8gx\" (UniqueName: \"kubernetes.io/projected/6a0f7718-67dc-4701-a6d9-a0f852eb4441-kube-api-access-bq8gx\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520530 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-socket-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-cnibin\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.521111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-os-release\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-run-netns\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-log-socket\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520648 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-socket-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/711c83e1-762b-4a01-8f25-65c6c4407f6d-cni-binary-copy\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-sysctl-conf\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520664 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-cnibin\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-lib-modules\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-run-netns\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520725 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-etc-kubernetes\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-node-log\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-etc-kubernetes\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-ovnkube-script-lib\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/711c83e1-762b-4a01-8f25-65c6c4407f6d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520838 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-var-lib-kubelet\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-cni-dir\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6" Apr 16 18:17:09.521843 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/804384dc-65e2-43b1-adfc-3eed621ccd98-iptables-alerter-script\") pod \"iptables-alerter-lwq9m\" (UID: \"804384dc-65e2-43b1-adfc-3eed621ccd98\") " pod="openshift-network-operator/iptables-alerter-lwq9m" Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-slash\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-node-log\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520925 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-lib-modules\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-tuned\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j" Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520975 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dz2z\" (UniqueName: \"kubernetes.io/projected/e53b9f1d-b078-4852-a6ec-bcffd463c187-kube-api-access-9dz2z\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-run-openvswitch\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521036 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-cni-bin\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521056 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-os-release\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc"
Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-kubernetes\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j"
Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-registration-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx"
Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-device-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx"
Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-ovnkube-config\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521162 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/711c83e1-762b-4a01-8f25-65c6c4407f6d-cni-binary-copy\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc"
Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-var-lib-cni-multus\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6"
Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc"
Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-log-socket\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.522926 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-run-openvswitch\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-registration-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521285 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e53b9f1d-b078-4852-a6ec-bcffd463c187-device-dir\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-cni-bin\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/711c83e1-762b-4a01-8f25-65c6c4407f6d-os-release\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521372 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-ovnkube-script-lib\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebe5bfb0-23f3-4c66-9cc1-2436ea624b37-host\") pod \"node-ca-v22jh\" (UID: \"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37\") " pod="openshift-image-registry/node-ca-v22jh"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521428 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ebe5bfb0-23f3-4c66-9cc1-2436ea624b37-serviceca\") pod \"node-ca-v22jh\" (UID: \"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37\") " pod="openshift-image-registry/node-ca-v22jh"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-host-slash\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-daemon-config\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/711c83e1-762b-4a01-8f25-65c6c4407f6d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-ovn-node-metrics-cert\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521511 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-kubernetes\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521529 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-var-lib-kubelet\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.520865 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-sysctl-conf\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-host-var-lib-cni-multus\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-ovnkube-config\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521817 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebe5bfb0-23f3-4c66-9cc1-2436ea624b37-host\") pod \"node-ca-v22jh\" (UID: \"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37\") " pod="openshift-image-registry/node-ca-v22jh"
Apr 16 18:17:09.523434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.521957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ebe5bfb0-23f3-4c66-9cc1-2436ea624b37-serviceca\") pod \"node-ca-v22jh\" (UID: \"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37\") " pod="openshift-image-registry/node-ca-v22jh"
Apr 16 18:17:09.524005 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.522078 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-daemon-config\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6"
Apr 16 18:17:09.524005 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.522094 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/804384dc-65e2-43b1-adfc-3eed621ccd98-iptables-alerter-script\") pod \"iptables-alerter-lwq9m\" (UID: \"804384dc-65e2-43b1-adfc-3eed621ccd98\") " pod="openshift-network-operator/iptables-alerter-lwq9m"
Apr 16 18:17:09.524005 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.522129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0f7718-67dc-4701-a6d9-a0f852eb4441-multus-cni-dir\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6"
Apr 16 18:17:09.524005 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.522352 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f1a4c046-40fc-4ba3-83df-94d70edbbba2-tmp\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j"
Apr 16 18:17:09.524005 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.522600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/612b22f4-6ba0-49fb-8b66-2cff81a247be-agent-certs\") pod \"konnectivity-agent-vwvb6\" (UID: \"612b22f4-6ba0-49fb-8b66-2cff81a247be\") " pod="kube-system/konnectivity-agent-vwvb6"
Apr 16 18:17:09.524005 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.523178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f1a4c046-40fc-4ba3-83df-94d70edbbba2-etc-tuned\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j"
Apr 16 18:17:09.524005 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.523833 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-ovn-node-metrics-cert\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.529709 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:09.529691 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:09.529709 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:09.529710 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:09.529848 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:09.529725 2572 projected.go:194] Error preparing data for projected volume kube-api-access-vlg85 for pod openshift-network-diagnostics/network-check-target-rbqgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:09.529848 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:09.529775 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85 podName:f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:10.029759 +0000 UTC m=+3.054801781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vlg85" (UniqueName: "kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85") pod "network-check-target-rbqgb" (UID: "f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:09.531350 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.531324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlrxz\" (UniqueName: \"kubernetes.io/projected/804384dc-65e2-43b1-adfc-3eed621ccd98-kube-api-access-mlrxz\") pod \"iptables-alerter-lwq9m\" (UID: \"804384dc-65e2-43b1-adfc-3eed621ccd98\") " pod="openshift-network-operator/iptables-alerter-lwq9m"
Apr 16 18:17:09.532027 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.531981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zskh\" (UniqueName: \"kubernetes.io/projected/8c9b6947-3fb0-4af0-baf5-9af029c0ab42-kube-api-access-6zskh\") pod \"ovnkube-node-8vpdh\" (UID: \"8c9b6947-3fb0-4af0-baf5-9af029c0ab42\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:09.532313 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.532289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpr2z\" (UniqueName: \"kubernetes.io/projected/711c83e1-762b-4a01-8f25-65c6c4407f6d-kube-api-access-vpr2z\") pod \"multus-additional-cni-plugins-64mfc\" (UID: \"711c83e1-762b-4a01-8f25-65c6c4407f6d\") " pod="openshift-multus/multus-additional-cni-plugins-64mfc"
Apr 16 18:17:09.533731 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.533318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq56c\" (UniqueName: \"kubernetes.io/projected/ebe5bfb0-23f3-4c66-9cc1-2436ea624b37-kube-api-access-xq56c\") pod \"node-ca-v22jh\" (UID: \"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37\") " pod="openshift-image-registry/node-ca-v22jh"
Apr 16 18:17:09.533731 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.533666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmb9c\" (UniqueName: \"kubernetes.io/projected/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-kube-api-access-tmb9c\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:09.533882 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.533755 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx6l8\" (UniqueName: \"kubernetes.io/projected/f1a4c046-40fc-4ba3-83df-94d70edbbba2-kube-api-access-gx6l8\") pod \"tuned-4794j\" (UID: \"f1a4c046-40fc-4ba3-83df-94d70edbbba2\") " pod="openshift-cluster-node-tuning-operator/tuned-4794j"
Apr 16 18:17:09.536267 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.534304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dz2z\" (UniqueName: \"kubernetes.io/projected/e53b9f1d-b078-4852-a6ec-bcffd463c187-kube-api-access-9dz2z\") pod \"aws-ebs-csi-driver-node-f5fnx\" (UID: \"e53b9f1d-b078-4852-a6ec-bcffd463c187\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx"
Apr 16 18:17:09.536267 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.534485 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq8gx\" (UniqueName: \"kubernetes.io/projected/6a0f7718-67dc-4701-a6d9-a0f852eb4441-kube-api-access-bq8gx\") pod \"multus-xjsh6\" (UID: \"6a0f7718-67dc-4701-a6d9-a0f852eb4441\") " pod="openshift-multus/multus-xjsh6"
Apr 16 18:17:09.698278 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.698212 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:09.709353 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.709329 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4794j"
Apr 16 18:17:09.715980 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.715962 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v22jh"
Apr 16 18:17:09.723551 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.723534 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-64mfc"
Apr 16 18:17:09.729088 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.729072 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xjsh6"
Apr 16 18:17:09.735553 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.735537 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lwq9m"
Apr 16 18:17:09.741795 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.741779 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vwvb6"
Apr 16 18:17:09.749040 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.749023 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx"
Apr 16 18:17:09.753549 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:09.753534 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:10.024599 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.024573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:10.024733 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:10.024694 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:10.024769 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:10.024748 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs podName:8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:11.024732152 +0000 UTC m=+4.049774951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs") pod "network-metrics-daemon-kn4cv" (UID: "8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:10.048224 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:10.048130 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c9b6947_3fb0_4af0_baf5_9af029c0ab42.slice/crio-6226b8a6f1df0e95e992d39b5169503c59ddfd8f9101dbf4e4df93b71897133b WatchSource:0}: Error finding container 6226b8a6f1df0e95e992d39b5169503c59ddfd8f9101dbf4e4df93b71897133b: Status 404 returned error can't find the container with id 6226b8a6f1df0e95e992d39b5169503c59ddfd8f9101dbf4e4df93b71897133b
Apr 16 18:17:10.050571 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:10.050482 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804384dc_65e2_43b1_adfc_3eed621ccd98.slice/crio-2ba8425626d53b6412919fc6a496fa9df8aba9110b9a3f16cbb9bf3be331117f WatchSource:0}: Error finding container 2ba8425626d53b6412919fc6a496fa9df8aba9110b9a3f16cbb9bf3be331117f: Status 404 returned error can't find the container with id 2ba8425626d53b6412919fc6a496fa9df8aba9110b9a3f16cbb9bf3be331117f
Apr 16 18:17:10.051911 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:10.051880 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0f7718_67dc_4701_a6d9_a0f852eb4441.slice/crio-063e4d0f00a849b3a64fd96649169a4b83dddb0295e3d9628fc2bd5b75fd8a63 WatchSource:0}: Error finding container 063e4d0f00a849b3a64fd96649169a4b83dddb0295e3d9628fc2bd5b75fd8a63: Status 404 returned error can't find the container with id 063e4d0f00a849b3a64fd96649169a4b83dddb0295e3d9628fc2bd5b75fd8a63
Apr 16 18:17:10.052723 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:10.052700 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711c83e1_762b_4a01_8f25_65c6c4407f6d.slice/crio-73c5f7f19cad2b076e96c9b7dd093537cc1ccd175410b378bf2ce547d87645bf WatchSource:0}: Error finding container 73c5f7f19cad2b076e96c9b7dd093537cc1ccd175410b378bf2ce547d87645bf: Status 404 returned error can't find the container with id 73c5f7f19cad2b076e96c9b7dd093537cc1ccd175410b378bf2ce547d87645bf
Apr 16 18:17:10.053568 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:10.053540 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a4c046_40fc_4ba3_83df_94d70edbbba2.slice/crio-48925a051212e44cbe5096c9d9094ef8119730777b75036bc3a95d3d3eff6269 WatchSource:0}: Error finding container 48925a051212e44cbe5096c9d9094ef8119730777b75036bc3a95d3d3eff6269: Status 404 returned error can't find the container with id 48925a051212e44cbe5096c9d9094ef8119730777b75036bc3a95d3d3eff6269
Apr 16 18:17:10.054499 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:10.054475 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode53b9f1d_b078_4852_a6ec_bcffd463c187.slice/crio-5b29b080197e88eb1fe33f51b331146991b974ddf940a98913e8d2e94f884545 WatchSource:0}: Error finding container 5b29b080197e88eb1fe33f51b331146991b974ddf940a98913e8d2e94f884545: Status 404 returned error can't find the container with id 5b29b080197e88eb1fe33f51b331146991b974ddf940a98913e8d2e94f884545
Apr 16 18:17:10.055791 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:10.055769 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebe5bfb0_23f3_4c66_9cc1_2436ea624b37.slice/crio-5b832fe008e710bf0d678224c4b6104ce4432ac4a7a4f9da1dac6f38db9b7407 WatchSource:0}: Error finding container 5b832fe008e710bf0d678224c4b6104ce4432ac4a7a4f9da1dac6f38db9b7407: Status 404 returned error can't find the container with id 5b832fe008e710bf0d678224c4b6104ce4432ac4a7a4f9da1dac6f38db9b7407
Apr 16 18:17:10.057757 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:10.057734 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod612b22f4_6ba0_49fb_8b66_2cff81a247be.slice/crio-c8a7e3e913af98cd70a58e93016479bebec9db3eff0de3f8f165912efcb2429c WatchSource:0}: Error finding container c8a7e3e913af98cd70a58e93016479bebec9db3eff0de3f8f165912efcb2429c: Status 404 returned error can't find the container with id c8a7e3e913af98cd70a58e93016479bebec9db3eff0de3f8f165912efcb2429c
Apr 16 18:17:10.125141 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.125017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlg85\" (UniqueName: \"kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85\") pod \"network-check-target-rbqgb\" (UID: \"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5\") " pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:10.125216 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:10.125124 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:10.125216 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:10.125198 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:10.125216 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:10.125209 2572 projected.go:194] Error preparing data for projected volume kube-api-access-vlg85 for pod openshift-network-diagnostics/network-check-target-rbqgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:10.125323 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:10.125251 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85 podName:f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:11.125236905 +0000 UTC m=+4.150279683 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vlg85" (UniqueName: "kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85") pod "network-check-target-rbqgb" (UID: "f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:10.374752 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.374654 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:10.460816 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.460310 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:08 +0000 UTC" deadline="2028-01-13 01:14:35.425354984 +0000 UTC"
Apr 16 18:17:10.460816 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.460339 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15270h57m24.965019954s"
Apr 16 18:17:10.578402 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.578373 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:10.578576 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:10.578507 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7"
Apr 16 18:17:10.578951 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.578914 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:10.579052 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:10.579021 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5"
Apr 16 18:17:10.596238 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.595923 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vwvb6" event={"ID":"612b22f4-6ba0-49fb-8b66-2cff81a247be","Type":"ContainerStarted","Data":"c8a7e3e913af98cd70a58e93016479bebec9db3eff0de3f8f165912efcb2429c"}
Apr 16 18:17:10.598794 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.598768 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" event={"ID":"e53b9f1d-b078-4852-a6ec-bcffd463c187","Type":"ContainerStarted","Data":"5b29b080197e88eb1fe33f51b331146991b974ddf940a98913e8d2e94f884545"}
Apr 16 18:17:10.601979 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.601954 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4794j" event={"ID":"f1a4c046-40fc-4ba3-83df-94d70edbbba2","Type":"ContainerStarted","Data":"48925a051212e44cbe5096c9d9094ef8119730777b75036bc3a95d3d3eff6269"}
Apr 16 18:17:10.604732 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.604709 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjsh6" event={"ID":"6a0f7718-67dc-4701-a6d9-a0f852eb4441","Type":"ContainerStarted","Data":"063e4d0f00a849b3a64fd96649169a4b83dddb0295e3d9628fc2bd5b75fd8a63"}
Apr 16 18:17:10.609673 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.609651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lwq9m" event={"ID":"804384dc-65e2-43b1-adfc-3eed621ccd98","Type":"ContainerStarted","Data":"2ba8425626d53b6412919fc6a496fa9df8aba9110b9a3f16cbb9bf3be331117f"}
Apr 16 18:17:10.626658 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.626240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal" event={"ID":"f1b0f110a831d4055d6fd2016cd3ab02","Type":"ContainerStarted","Data":"b3f5512d391f2ac32101fa17b59791ab5961d27d9bade439e32ac804709c96d3"}
Apr 16 18:17:10.629645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.629615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64mfc" event={"ID":"711c83e1-762b-4a01-8f25-65c6c4407f6d","Type":"ContainerStarted","Data":"73c5f7f19cad2b076e96c9b7dd093537cc1ccd175410b378bf2ce547d87645bf"}
Apr 16 18:17:10.632386 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.632347 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v22jh" event={"ID":"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37","Type":"ContainerStarted","Data":"5b832fe008e710bf0d678224c4b6104ce4432ac4a7a4f9da1dac6f38db9b7407"}
Apr 16 18:17:10.644052 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:10.644019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerStarted","Data":"6226b8a6f1df0e95e992d39b5169503c59ddfd8f9101dbf4e4df93b71897133b"}
Apr 16 18:17:11.032980 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:11.032950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:11.033128 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:11.033097 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:11.033240 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:11.033226 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs podName:8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:13.033131945 +0000 UTC m=+6.058174739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs") pod "network-metrics-daemon-kn4cv" (UID: "8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:11.135077 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:11.133602 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlg85\" (UniqueName: \"kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85\") pod \"network-check-target-rbqgb\" (UID: \"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5\") " pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:11.135077 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:11.133771 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:11.135077 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:11.133790 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:11.135077 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:11.133802 2572 projected.go:194] Error preparing data for projected volume kube-api-access-vlg85 for pod openshift-network-diagnostics/network-check-target-rbqgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:11.135077 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:11.133853 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85 podName:f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:13.133836433 +0000 UTC m=+6.158879214 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vlg85" (UniqueName: "kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85") pod "network-check-target-rbqgb" (UID: "f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:11.664287 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:11.664253 2572 generic.go:358] "Generic (PLEG): container finished" podID="a7f865783273d649a9fd06761511c430" containerID="4f4cec9200a021aedc570959c114117008bcc14b0d8b7a0eabdaeab3b575f3ae" exitCode=0
Apr 16 18:17:11.664708 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:11.664351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal" event={"ID":"a7f865783273d649a9fd06761511c430","Type":"ContainerDied","Data":"4f4cec9200a021aedc570959c114117008bcc14b0d8b7a0eabdaeab3b575f3ae"}
Apr 16 18:17:11.677174 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:11.677122 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-175.ec2.internal" podStartSLOduration=3.6771051850000003 podStartE2EDuration="3.677105185s" podCreationTimestamp="2026-04-16 18:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:10.645367205 +0000 UTC m=+3.670410005" watchObservedRunningTime="2026-04-16 18:17:11.677105185 +0000 UTC m=+4.702147967"
Apr 16 18:17:12.578910 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:12.578837 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:12.579124 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:12.578967 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7"
Apr 16 18:17:12.579383 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:12.579365 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:12.579498 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:12.579455 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:12.669697 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:12.669645 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal" event={"ID":"a7f865783273d649a9fd06761511c430","Type":"ContainerStarted","Data":"b1afdeddc4f65f4efa0a2c74fb29c7617a77a150d5526addb84e2c3329f2dd6e"} Apr 16 18:17:13.048828 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:13.048754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:13.048975 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:13.048899 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:13.048975 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:13.048961 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs podName:8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:17.048947535 +0000 UTC m=+10.073990316 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs") pod "network-metrics-daemon-kn4cv" (UID: "8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:13.149274 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:13.149241 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlg85\" (UniqueName: \"kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85\") pod \"network-check-target-rbqgb\" (UID: \"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5\") " pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:13.149438 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:13.149405 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:13.149438 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:13.149422 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:13.149438 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:13.149434 2572 projected.go:194] Error preparing data for projected volume kube-api-access-vlg85 for pod openshift-network-diagnostics/network-check-target-rbqgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:13.149602 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:13.149494 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85 podName:f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:17.149477359 +0000 UTC m=+10.174520154 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vlg85" (UniqueName: "kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85") pod "network-check-target-rbqgb" (UID: "f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:14.578652 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:14.578578 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:14.579125 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:14.578697 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:14.579125 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:14.578589 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:14.579125 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:14.578848 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7" Apr 16 18:17:16.578699 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:16.578660 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:16.579205 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:16.578729 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:16.579205 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:16.578827 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7" Apr 16 18:17:16.579205 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:16.578926 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:17.078712 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:17.078625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:17.078872 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:17.078785 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:17.078872 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:17.078851 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs podName:8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:25.078832133 +0000 UTC m=+18.103874913 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs") pod "network-metrics-daemon-kn4cv" (UID: "8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:17.179332 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:17.179292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlg85\" (UniqueName: \"kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85\") pod \"network-check-target-rbqgb\" (UID: \"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5\") " pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:17.179501 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:17.179474 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:17.179501 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:17.179493 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:17.179603 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:17.179504 2572 projected.go:194] Error preparing data for projected volume kube-api-access-vlg85 for pod openshift-network-diagnostics/network-check-target-rbqgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:17.179603 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:17.179559 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85 podName:f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:25.179540204 +0000 UTC m=+18.204583003 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vlg85" (UniqueName: "kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85") pod "network-check-target-rbqgb" (UID: "f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:18.578709 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:18.578679 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:18.578709 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:18.578697 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:18.579117 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:18.578804 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:18.579117 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:18.578920 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7" Apr 16 18:17:20.577916 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:20.577883 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:20.578345 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:20.577883 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:20.578345 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:20.578010 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:20.578345 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:20.578096 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7" Apr 16 18:17:21.798742 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:21.798697 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-175.ec2.internal" podStartSLOduration=13.79868386 podStartE2EDuration="13.79868386s" podCreationTimestamp="2026-04-16 18:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:12.683201726 +0000 UTC m=+5.708244519" watchObservedRunningTime="2026-04-16 18:17:21.79868386 +0000 UTC m=+14.823726671" Apr 16 18:17:21.799313 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:21.799296 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wh7d5"] Apr 16 18:17:21.848866 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:21.848842 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:21.849018 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:21.848907 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wh7d5" podUID="8cd9794f-105a-4fbf-ae0d-7f399cb33595" Apr 16 18:17:21.915343 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:21.915318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8cd9794f-105a-4fbf-ae0d-7f399cb33595-dbus\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:21.915459 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:21.915360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:21.915459 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:21.915388 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8cd9794f-105a-4fbf-ae0d-7f399cb33595-kubelet-config\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:22.016144 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:22.016117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8cd9794f-105a-4fbf-ae0d-7f399cb33595-dbus\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:22.016280 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:22.016173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:22.016280 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:22.016213 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8cd9794f-105a-4fbf-ae0d-7f399cb33595-kubelet-config\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:22.016383 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:22.016289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8cd9794f-105a-4fbf-ae0d-7f399cb33595-kubelet-config\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:22.016383 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:22.016333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8cd9794f-105a-4fbf-ae0d-7f399cb33595-dbus\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:22.016383 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:22.016358 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:22.016496 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:22.016414 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret podName:8cd9794f-105a-4fbf-ae0d-7f399cb33595 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:22.516400055 +0000 UTC m=+15.541442832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret") pod "global-pull-secret-syncer-wh7d5" (UID: "8cd9794f-105a-4fbf-ae0d-7f399cb33595") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:22.521769 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:22.521737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:22.521916 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:22.521892 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:22.521971 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:22.521956 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret podName:8cd9794f-105a-4fbf-ae0d-7f399cb33595 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:23.521937041 +0000 UTC m=+16.546979829 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret") pod "global-pull-secret-syncer-wh7d5" (UID: "8cd9794f-105a-4fbf-ae0d-7f399cb33595") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:22.578273 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:22.578249 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:22.578436 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:22.578250 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:22.578436 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:22.578348 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:22.578552 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:22.578459 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7" Apr 16 18:17:23.528194 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:23.528163 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:23.528575 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:23.528298 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:23.528575 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:23.528360 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret podName:8cd9794f-105a-4fbf-ae0d-7f399cb33595 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:25.528339741 +0000 UTC m=+18.553382519 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret") pod "global-pull-secret-syncer-wh7d5" (UID: "8cd9794f-105a-4fbf-ae0d-7f399cb33595") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:23.578185 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:23.578159 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:23.578320 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:23.578293 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wh7d5" podUID="8cd9794f-105a-4fbf-ae0d-7f399cb33595" Apr 16 18:17:24.577835 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:24.577805 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:24.577835 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:24.577841 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:24.578375 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:24.577942 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7" Apr 16 18:17:24.578375 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:24.578091 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:25.142293 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:25.142262 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:25.142520 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:25.142416 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:25.142520 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:25.142478 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs podName:8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:41.142461916 +0000 UTC m=+34.167504695 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs") pod "network-metrics-daemon-kn4cv" (UID: "8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:25.242721 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:25.242680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlg85\" (UniqueName: \"kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85\") pod \"network-check-target-rbqgb\" (UID: \"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5\") " pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:25.242885 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:25.242838 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:25.242885 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:25.242864 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:25.242885 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:25.242878 2572 projected.go:194] Error preparing data for projected volume kube-api-access-vlg85 for pod openshift-network-diagnostics/network-check-target-rbqgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:25.243063 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:25.242939 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85 podName:f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:41.242918492 +0000 UTC m=+34.267961292 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vlg85" (UniqueName: "kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85") pod "network-check-target-rbqgb" (UID: "f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:25.544788 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:25.544746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:25.544971 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:25.544931 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:25.545054 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:25.545017 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret podName:8cd9794f-105a-4fbf-ae0d-7f399cb33595 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:29.544982519 +0000 UTC m=+22.570025295 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret") pod "global-pull-secret-syncer-wh7d5" (UID: "8cd9794f-105a-4fbf-ae0d-7f399cb33595") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:25.578905 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:25.578884 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:25.579241 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:25.578967 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wh7d5" podUID="8cd9794f-105a-4fbf-ae0d-7f399cb33595"
Apr 16 18:17:26.578750 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:26.578724 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:26.578874 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:26.578854 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7"
Apr 16 18:17:26.579197 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:26.579170 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:26.579543 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:26.579269 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5"
Apr 16 18:17:26.700386 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:26.699205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerStarted","Data":"1ae30f9210f4cb8d1115817ee436f1d81b0fa24b4901ce48db248ad807eb2312"}
Apr 16 18:17:26.703437 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:26.703386 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4794j" event={"ID":"f1a4c046-40fc-4ba3-83df-94d70edbbba2","Type":"ContainerStarted","Data":"836a796ee011d33cede338b9e8d429b1f4670d02b85b4917b3562e962c4df99e"}
Apr 16 18:17:26.747235 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:26.746919 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4794j" podStartSLOduration=3.326906959 podStartE2EDuration="19.746901235s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:10.056358754 +0000 UTC m=+3.081401532" lastFinishedPulling="2026-04-16 18:17:26.476353018 +0000 UTC m=+19.501395808" observedRunningTime="2026-04-16 18:17:26.724750208 +0000 UTC m=+19.749793005" watchObservedRunningTime="2026-04-16 18:17:26.746901235 +0000 UTC m=+19.771944033"
Apr 16 18:17:26.747344 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:26.747237 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xjsh6" podStartSLOduration=3.19978905 podStartE2EDuration="19.747227371s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:10.053722241 +0000 UTC m=+3.078765020" lastFinishedPulling="2026-04-16 18:17:26.601160559 +0000 UTC m=+19.626203341" observedRunningTime="2026-04-16 18:17:26.746503919 +0000 UTC m=+19.771546718" watchObservedRunningTime="2026-04-16 18:17:26.747227371 +0000 UTC m=+19.772270172"
Apr 16 18:17:27.068705 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.068529 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xxr9z"]
Apr 16 18:17:27.071221 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.071198 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.073649 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.073458 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:17:27.073649 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.073587 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:17:27.073649 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.073589 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-824f6\""
Apr 16 18:17:27.158811 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.158781 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh4mw\" (UniqueName: \"kubernetes.io/projected/dc04a45e-8316-4c0a-86a5-12c986bd0756-kube-api-access-qh4mw\") pod \"node-resolver-xxr9z\" (UID: \"dc04a45e-8316-4c0a-86a5-12c986bd0756\") " pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.158925 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.158859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dc04a45e-8316-4c0a-86a5-12c986bd0756-hosts-file\") pod \"node-resolver-xxr9z\" (UID: \"dc04a45e-8316-4c0a-86a5-12c986bd0756\") " pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.158925 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.158897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dc04a45e-8316-4c0a-86a5-12c986bd0756-tmp-dir\") pod \"node-resolver-xxr9z\" (UID: \"dc04a45e-8316-4c0a-86a5-12c986bd0756\") " pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.260015 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.259931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dc04a45e-8316-4c0a-86a5-12c986bd0756-hosts-file\") pod \"node-resolver-xxr9z\" (UID: \"dc04a45e-8316-4c0a-86a5-12c986bd0756\") " pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.260015 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.259967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dc04a45e-8316-4c0a-86a5-12c986bd0756-tmp-dir\") pod \"node-resolver-xxr9z\" (UID: \"dc04a45e-8316-4c0a-86a5-12c986bd0756\") " pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.260193 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.260067 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dc04a45e-8316-4c0a-86a5-12c986bd0756-hosts-file\") pod \"node-resolver-xxr9z\" (UID: \"dc04a45e-8316-4c0a-86a5-12c986bd0756\") " pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.260193 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.260150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh4mw\" (UniqueName: \"kubernetes.io/projected/dc04a45e-8316-4c0a-86a5-12c986bd0756-kube-api-access-qh4mw\") pod \"node-resolver-xxr9z\" (UID: \"dc04a45e-8316-4c0a-86a5-12c986bd0756\") " pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.260279 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.260220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dc04a45e-8316-4c0a-86a5-12c986bd0756-tmp-dir\") pod \"node-resolver-xxr9z\" (UID: \"dc04a45e-8316-4c0a-86a5-12c986bd0756\") " pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.327105 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.327078 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh4mw\" (UniqueName: \"kubernetes.io/projected/dc04a45e-8316-4c0a-86a5-12c986bd0756-kube-api-access-qh4mw\") pod \"node-resolver-xxr9z\" (UID: \"dc04a45e-8316-4c0a-86a5-12c986bd0756\") " pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.380060 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.380038 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xxr9z"
Apr 16 18:17:27.463532 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:27.463499 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc04a45e_8316_4c0a_86a5_12c986bd0756.slice/crio-281d08e8a89fe35641a53536b2dfc7b0328de7afd59b638e7d968fe05e0d74e2 WatchSource:0}: Error finding container 281d08e8a89fe35641a53536b2dfc7b0328de7afd59b638e7d968fe05e0d74e2: Status 404 returned error can't find the container with id 281d08e8a89fe35641a53536b2dfc7b0328de7afd59b638e7d968fe05e0d74e2
Apr 16 18:17:27.578703 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.578677 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:27.578816 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:27.578783 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wh7d5" podUID="8cd9794f-105a-4fbf-ae0d-7f399cb33595"
Apr 16 18:17:27.625909 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.625890 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:17:27.723073 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.723041 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vwvb6" event={"ID":"612b22f4-6ba0-49fb-8b66-2cff81a247be","Type":"ContainerStarted","Data":"bb4ecb4231f8311ae20684de5424a28135d2addd25013ec16d53680bcf34545b"}
Apr 16 18:17:27.724532 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.724511 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" event={"ID":"e53b9f1d-b078-4852-a6ec-bcffd463c187","Type":"ContainerStarted","Data":"ccadcce1f5cd82af1bea944f1da25022fb344d7ceb852573507ffa9bce2875ce"}
Apr 16 18:17:27.724532 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.724534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" event={"ID":"e53b9f1d-b078-4852-a6ec-bcffd463c187","Type":"ContainerStarted","Data":"35dc936321196dea891e53e36296b92e30cbbffbc83512d0e47a43dbeaba8147"}
Apr 16 18:17:27.725562 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.725543 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjsh6" event={"ID":"6a0f7718-67dc-4701-a6d9-a0f852eb4441","Type":"ContainerStarted","Data":"981c9660a36af7623d7af7df23bd02d32ab78f9f839a5fe62286a3aa49f7487c"}
Apr 16 18:17:27.726741 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.726714 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xxr9z" event={"ID":"dc04a45e-8316-4c0a-86a5-12c986bd0756","Type":"ContainerStarted","Data":"80a8dc09493daaf46dbb0e7fcd5d98269f5681f34c809bd5d9999b3f18191e3b"}
Apr 16 18:17:27.726741 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.726736 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xxr9z" event={"ID":"dc04a45e-8316-4c0a-86a5-12c986bd0756","Type":"ContainerStarted","Data":"281d08e8a89fe35641a53536b2dfc7b0328de7afd59b638e7d968fe05e0d74e2"}
Apr 16 18:17:27.727961 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.727941 2572 generic.go:358] "Generic (PLEG): container finished" podID="711c83e1-762b-4a01-8f25-65c6c4407f6d" containerID="41c5052b21e73ef8098978b684d2760bd67fa6884d8acbece09925f9f00bdcac" exitCode=0
Apr 16 18:17:27.728053 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.728017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64mfc" event={"ID":"711c83e1-762b-4a01-8f25-65c6c4407f6d","Type":"ContainerDied","Data":"41c5052b21e73ef8098978b684d2760bd67fa6884d8acbece09925f9f00bdcac"}
Apr 16 18:17:27.729257 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.729239 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v22jh" event={"ID":"ebe5bfb0-23f3-4c66-9cc1-2436ea624b37","Type":"ContainerStarted","Data":"58d4eab46a0bb38a60eb67892d1e63c2a1c83b9052e61922093f14c5a7b58593"}
Apr 16 18:17:27.732440 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.732424 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log"
Apr 16 18:17:27.732827 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.732807 2572 generic.go:358] "Generic (PLEG): container finished" podID="8c9b6947-3fb0-4af0-baf5-9af029c0ab42" containerID="6dde65cb0f116bff58c7075e2bad6fef43269616a23acd646483ff75eacfa2ca" exitCode=1
Apr 16 18:17:27.732917 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.732876 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerStarted","Data":"6b79a4766ce1eeb25e9debb575bc2b1c118a3c1ba213ff542187fa925d24bb3c"}
Apr 16 18:17:27.732917 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.732898 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerStarted","Data":"83bdc4a71bde7d7d711e6ad450672901c5917cc92e7b2ff2ae711b90bb92a899"}
Apr 16 18:17:27.732917 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.732907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerStarted","Data":"5d1f3dcfc9a690c22c21de5cd4d34cf52587c15785069b1b6d26627c3d778d55"}
Apr 16 18:17:27.732917 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.732916 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerStarted","Data":"34b9d9836f781fa609411f01ab340082cc489a4ebe8a069434624be0e196a524"}
Apr 16 18:17:27.733100 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.732924 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerDied","Data":"6dde65cb0f116bff58c7075e2bad6fef43269616a23acd646483ff75eacfa2ca"}
Apr 16 18:17:27.754795 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.754760 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-v22jh" podStartSLOduration=4.36282814 podStartE2EDuration="20.754750263s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:10.059948383 +0000 UTC m=+3.084991176" lastFinishedPulling="2026-04-16 18:17:26.451870519 +0000 UTC m=+19.476913299" observedRunningTime="2026-04-16 18:17:27.754523398 +0000 UTC m=+20.779566200" watchObservedRunningTime="2026-04-16 18:17:27.754750263 +0000 UTC m=+20.779793061"
Apr 16 18:17:27.755034 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.755012 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vwvb6" podStartSLOduration=4.363698017 podStartE2EDuration="20.755007635s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:10.060483085 +0000 UTC m=+3.085525865" lastFinishedPulling="2026-04-16 18:17:26.451792698 +0000 UTC m=+19.476835483" observedRunningTime="2026-04-16 18:17:27.739577099 +0000 UTC m=+20.764619896" watchObservedRunningTime="2026-04-16 18:17:27.755007635 +0000 UTC m=+20.780050424"
Apr 16 18:17:27.791810 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:27.791746 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xxr9z" podStartSLOduration=0.791736963 podStartE2EDuration="791.736963ms" podCreationTimestamp="2026-04-16 18:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:27.769461217 +0000 UTC m=+20.794504026" watchObservedRunningTime="2026-04-16 18:17:27.791736963 +0000 UTC m=+20.816779761"
Apr 16 18:17:28.480033 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:28.479897 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:17:27.625903702Z","UUID":"3e6662c1-d742-4070-8c0e-503c87bee213","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:17:28.483571 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:28.483551 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:17:28.483672 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:28.483579 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:17:28.578516 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:28.578485 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:28.578516 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:28.578513 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:28.578680 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:28.578613 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7"
Apr 16 18:17:28.578768 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:28.578742 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5"
Apr 16 18:17:28.736583 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:28.736552 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" event={"ID":"e53b9f1d-b078-4852-a6ec-bcffd463c187","Type":"ContainerStarted","Data":"f5ac2b142b91322fc7e3abec9c99530dade823e5a73056a6664dd8a07086cc1f"}
Apr 16 18:17:28.738222 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:28.738197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lwq9m" event={"ID":"804384dc-65e2-43b1-adfc-3eed621ccd98","Type":"ContainerStarted","Data":"208cb44342b0773afafeebb22d12d99cb801bffa6a01b743dad8b1bc06770415"}
Apr 16 18:17:28.754665 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:28.754620 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5fnx" podStartSLOduration=3.433542142 podStartE2EDuration="21.754607628s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:10.059836289 +0000 UTC m=+3.084879079" lastFinishedPulling="2026-04-16 18:17:28.380901788 +0000 UTC m=+21.405944565" observedRunningTime="2026-04-16 18:17:28.754524899 +0000 UTC m=+21.779567695" watchObservedRunningTime="2026-04-16 18:17:28.754607628 +0000 UTC m=+21.779650427"
Apr 16 18:17:28.769311 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:28.769272 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lwq9m" podStartSLOduration=5.3454357550000005 podStartE2EDuration="21.769263203s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:10.052526505 +0000 UTC m=+3.077569284" lastFinishedPulling="2026-04-16 18:17:26.476353943 +0000 UTC m=+19.501396732" observedRunningTime="2026-04-16 18:17:28.768800271 +0000 UTC m=+21.793843070" watchObservedRunningTime="2026-04-16 18:17:28.769263203 +0000 UTC m=+21.794305996"
Apr 16 18:17:29.578299 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:29.578107 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:29.578510 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:29.578394 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wh7d5" podUID="8cd9794f-105a-4fbf-ae0d-7f399cb33595"
Apr 16 18:17:29.578876 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:29.578844 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:29.579029 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:29.578977 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:29.579084 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:29.579055 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret podName:8cd9794f-105a-4fbf-ae0d-7f399cb33595 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:37.579037012 +0000 UTC m=+30.604079789 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret") pod "global-pull-secret-syncer-wh7d5" (UID: "8cd9794f-105a-4fbf-ae0d-7f399cb33595") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:29.743043 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:29.743015 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log"
Apr 16 18:17:29.743428 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:29.743400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerStarted","Data":"e8a8f7ff8ed9ce9b6ecf308b0311aa40d8c4baf36fc22f947c7efec46a8b447b"}
Apr 16 18:17:30.041067 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:30.041022 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vwvb6"
Apr 16 18:17:30.045447 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:30.045423 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vwvb6"
Apr 16 18:17:30.578353 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:30.578327 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:30.578533 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:30.578327 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:30.578533 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:30.578461 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7"
Apr 16 18:17:30.578624 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:30.578531 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5"
Apr 16 18:17:30.745406 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:30.745369 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vwvb6"
Apr 16 18:17:30.745931 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:30.745872 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vwvb6"
Apr 16 18:17:31.578809 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:31.578788 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:31.578897 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:31.578881 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wh7d5" podUID="8cd9794f-105a-4fbf-ae0d-7f399cb33595"
Apr 16 18:17:31.749408 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:31.749276 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log"
Apr 16 18:17:31.749877 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:31.749750 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerStarted","Data":"a73ac33570c3714aa57f53aaaa08dd11d95e984801dcbf20b5a6f85ccb6f5600"}
Apr 16 18:17:31.750029 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:31.750013 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:31.750223 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:31.750207 2572 scope.go:117] "RemoveContainer" containerID="6dde65cb0f116bff58c7075e2bad6fef43269616a23acd646483ff75eacfa2ca"
Apr 16 18:17:31.764176 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:31.764159 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:32.577975 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.577946 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:32.578185 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.577979 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:32.578185 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:32.578078 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7"
Apr 16 18:17:32.578294 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:32.578183 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5"
Apr 16 18:17:32.754219 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.754183 2572 generic.go:358] "Generic (PLEG): container finished" podID="711c83e1-762b-4a01-8f25-65c6c4407f6d" containerID="56f22cf369a06d6e2bde968e4a4094542d1e6fa8a0918675fe8c586aed8154d4" exitCode=0
Apr 16 18:17:32.754695 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.754285 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64mfc" event={"ID":"711c83e1-762b-4a01-8f25-65c6c4407f6d","Type":"ContainerDied","Data":"56f22cf369a06d6e2bde968e4a4094542d1e6fa8a0918675fe8c586aed8154d4"}
Apr 16 18:17:32.761726 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.761705 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log"
Apr 16 18:17:32.762165 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.762133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" event={"ID":"8c9b6947-3fb0-4af0-baf5-9af029c0ab42","Type":"ContainerStarted","Data":"b02c17aa47512aa2937d5ee9ccc1b9ad8d9a0fdab25683e697e422f8832cbf57"}
Apr 16 18:17:32.762245 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.762204 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:17:32.762430 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.762414 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:32.777423 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.777404 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:17:32.805937 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:32.805897 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" podStartSLOduration=9.318231623 podStartE2EDuration="25.805884632s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:10.050026927 +0000 UTC m=+3.075069708" lastFinishedPulling="2026-04-16 18:17:26.537679924 +0000 UTC m=+19.562722717" observedRunningTime="2026-04-16 18:17:32.804756842 +0000 UTC m=+25.829799641" watchObservedRunningTime="2026-04-16 18:17:32.805884632 +0000 UTC m=+25.830927429"
Apr 16 18:17:33.475868 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:33.475837 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kn4cv"]
Apr 16 18:17:33.476004 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:33.475948 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:17:33.476078 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:33.476060 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7"
Apr 16 18:17:33.478218 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:33.478196 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rbqgb"]
Apr 16 18:17:33.478324 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:33.478275 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:33.478410 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:33.478380 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5"
Apr 16 18:17:33.478744 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:33.478727 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wh7d5"]
Apr 16 18:17:33.478813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:33.478803 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:33.478884 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:33.478869 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wh7d5" podUID="8cd9794f-105a-4fbf-ae0d-7f399cb33595"
Apr 16 18:17:33.765904 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:33.765637 2572 generic.go:358] "Generic (PLEG): container finished" podID="711c83e1-762b-4a01-8f25-65c6c4407f6d" containerID="a57bd237f0512412fad1761c476d1ce9a7eae9297e189daa3f5f6ef6bab38890" exitCode=0
Apr 16 18:17:33.766252 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:33.765714 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64mfc" event={"ID":"711c83e1-762b-4a01-8f25-65c6c4407f6d","Type":"ContainerDied","Data":"a57bd237f0512412fad1761c476d1ce9a7eae9297e189daa3f5f6ef6bab38890"}
Apr 16 18:17:33.766252 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:33.766116 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:17:34.577919 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:34.577841 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:34.578079 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:34.577951 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:34.771767 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:34.771737 2572 generic.go:358] "Generic (PLEG): container finished" podID="711c83e1-762b-4a01-8f25-65c6c4407f6d" containerID="0cbfc17ff0d1029d757838e9be414b9286b9da4aa77d7ed64bcfd8d4d13d0ce0" exitCode=0 Apr 16 18:17:34.772139 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:34.771821 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64mfc" event={"ID":"711c83e1-762b-4a01-8f25-65c6c4407f6d","Type":"ContainerDied","Data":"0cbfc17ff0d1029d757838e9be414b9286b9da4aa77d7ed64bcfd8d4d13d0ce0"} Apr 16 18:17:34.772139 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:34.772021 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 18:17:35.579390 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:35.578640 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:35.579390 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:35.578651 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:35.579390 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:35.578799 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7" Apr 16 18:17:35.579390 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:35.578831 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wh7d5" podUID="8cd9794f-105a-4fbf-ae0d-7f399cb33595" Apr 16 18:17:35.660953 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:35.660922 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh" Apr 16 18:17:36.578104 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:36.578075 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:36.578682 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:36.578170 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:37.578970 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:37.578887 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:37.579399 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:37.579028 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7" Apr 16 18:17:37.579399 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:37.579085 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:37.579399 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:37.579176 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wh7d5" podUID="8cd9794f-105a-4fbf-ae0d-7f399cb33595" Apr 16 18:17:37.645039 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:37.645004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:37.645220 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:37.645137 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:37.645220 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:37.645203 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret podName:8cd9794f-105a-4fbf-ae0d-7f399cb33595 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:53.645183763 +0000 UTC m=+46.670226545 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret") pod "global-pull-secret-syncer-wh7d5" (UID: "8cd9794f-105a-4fbf-ae0d-7f399cb33595") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:38.578896 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:38.578863 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:38.579095 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:38.578979 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbqgb" podUID="f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5" Apr 16 18:17:39.350841 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.350770 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-175.ec2.internal" event="NodeReady" Apr 16 18:17:39.351027 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.350964 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:17:39.404501 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.404473 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dj5sz"] Apr 16 18:17:39.441745 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.441716 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9c9hd"] Apr 16 18:17:39.441942 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.441894 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.444178 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.444033 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:17:39.444178 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.444040 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:17:39.445788 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.445126 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m4v2s\"" Apr 16 18:17:39.457214 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.457193 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dj5sz"] Apr 16 18:17:39.457309 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.457218 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9c9hd"] Apr 16 18:17:39.457353 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.457311 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:17:39.459852 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.459803 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:17:39.460197 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.459870 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:17:39.460197 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.459877 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v2cgj\"" Apr 16 18:17:39.460197 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.459907 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:17:39.559290 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.559256 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-tmp-dir\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.559290 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.559291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-config-volume\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.559530 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.559317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:17:39.559530 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.559426 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.559530 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.559469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rtch\" (UniqueName: \"kubernetes.io/projected/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-kube-api-access-8rtch\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.559530 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.559496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm4vc\" (UniqueName: \"kubernetes.io/projected/5226006a-a858-4a19-a1d5-544f65d3a882-kube-api-access-lm4vc\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:17:39.578363 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.578333 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:39.578535 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.578514 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5" Apr 16 18:17:39.580648 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.580601 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:17:39.581034 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.580732 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:17:39.581034 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.580775 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8x9mw\"" Apr 16 18:17:39.660476 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.660443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.660645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.660493 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rtch\" (UniqueName: \"kubernetes.io/projected/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-kube-api-access-8rtch\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.660645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.660514 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm4vc\" (UniqueName: \"kubernetes.io/projected/5226006a-a858-4a19-a1d5-544f65d3a882-kube-api-access-lm4vc\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:17:39.660645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.660552 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-tmp-dir\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.660645 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:39.660561 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:39.660645 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:39.660636 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls podName:528ce5b9-64df-485a-8269-fb5ba8dc8ba5 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:40.160616354 +0000 UTC m=+33.185659131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls") pod "dns-default-dj5sz" (UID: "528ce5b9-64df-485a-8269-fb5ba8dc8ba5") : secret "dns-default-metrics-tls" not found Apr 16 18:17:39.660905 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.660569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-config-volume\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.660905 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.660713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:17:39.660905 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:39.660841 
2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:39.660905 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:39.660901 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert podName:5226006a-a858-4a19-a1d5-544f65d3a882 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:40.160886197 +0000 UTC m=+33.185928987 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert") pod "ingress-canary-9c9hd" (UID: "5226006a-a858-4a19-a1d5-544f65d3a882") : secret "canary-serving-cert" not found Apr 16 18:17:39.661166 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.661121 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-tmp-dir\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.661166 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.661147 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-config-volume\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.671538 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.671511 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rtch\" (UniqueName: \"kubernetes.io/projected/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-kube-api-access-8rtch\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:39.671645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:39.671617 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm4vc\" (UniqueName: \"kubernetes.io/projected/5226006a-a858-4a19-a1d5-544f65d3a882-kube-api-access-lm4vc\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:17:40.164748 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:40.164716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:40.164907 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:40.164779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:17:40.164907 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:40.164862 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:40.164907 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:40.164868 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:40.165024 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:40.164920 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert podName:5226006a-a858-4a19-a1d5-544f65d3a882 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:41.164907407 +0000 UTC m=+34.189950183 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert") pod "ingress-canary-9c9hd" (UID: "5226006a-a858-4a19-a1d5-544f65d3a882") : secret "canary-serving-cert" not found Apr 16 18:17:40.165024 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:40.164932 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls podName:528ce5b9-64df-485a-8269-fb5ba8dc8ba5 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:41.164926779 +0000 UTC m=+34.189969555 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls") pod "dns-default-dj5sz" (UID: "528ce5b9-64df-485a-8269-fb5ba8dc8ba5") : secret "dns-default-metrics-tls" not found Apr 16 18:17:40.578325 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:40.578303 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:40.580720 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:40.580700 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:17:40.581111 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:40.581093 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-v57pk\"" Apr 16 18:17:40.581488 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:40.581471 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:17:40.786682 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:40.786493 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64mfc" event={"ID":"711c83e1-762b-4a01-8f25-65c6c4407f6d","Type":"ContainerStarted","Data":"b240ca866a87f3cd0402e655a7e49e66d1ffca9c1fcb45d8dbb004fa50200780"} Apr 16 18:17:41.172456 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.172368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:17:41.172456 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.172420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:17:41.172456 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.172449 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:17:41.172649 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:41.172500 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:41.172649 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:41.172545 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:17:41.172649 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:41.172559 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert podName:5226006a-a858-4a19-a1d5-544f65d3a882 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:43.172541361 +0000 UTC m=+36.197584142 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert") pod "ingress-canary-9c9hd" (UID: "5226006a-a858-4a19-a1d5-544f65d3a882") : secret "canary-serving-cert" not found Apr 16 18:17:41.172649 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:41.172558 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:41.172649 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:41.172578 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs podName:8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:13.172568137 +0000 UTC m=+66.197610919 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs") pod "network-metrics-daemon-kn4cv" (UID: "8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7") : secret "metrics-daemon-secret" not found Apr 16 18:17:41.172649 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:41.172604 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls podName:528ce5b9-64df-485a-8269-fb5ba8dc8ba5 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:43.172591965 +0000 UTC m=+36.197634742 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls") pod "dns-default-dj5sz" (UID: "528ce5b9-64df-485a-8269-fb5ba8dc8ba5") : secret "dns-default-metrics-tls" not found Apr 16 18:17:41.272913 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.272890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlg85\" (UniqueName: \"kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85\") pod \"network-check-target-rbqgb\" (UID: \"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5\") " pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:41.275451 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.275436 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlg85\" (UniqueName: \"kubernetes.io/projected/f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5-kube-api-access-vlg85\") pod \"network-check-target-rbqgb\" (UID: \"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5\") " pod="openshift-network-diagnostics/network-check-target-rbqgb" Apr 16 18:17:41.490066 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.490047 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:41.659604 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.659570 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rbqgb"]
Apr 16 18:17:41.663660 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:41.663629 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3ea7281_e2af_4d20_96b3_8d0ff6b62ef5.slice/crio-8aeaa5cbfbeadd006f4f0e2a79d8d65ceedd7698af1be3146ff0404a7074c177 WatchSource:0}: Error finding container 8aeaa5cbfbeadd006f4f0e2a79d8d65ceedd7698af1be3146ff0404a7074c177: Status 404 returned error can't find the container with id 8aeaa5cbfbeadd006f4f0e2a79d8d65ceedd7698af1be3146ff0404a7074c177
Apr 16 18:17:41.790293 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.790204 2572 generic.go:358] "Generic (PLEG): container finished" podID="711c83e1-762b-4a01-8f25-65c6c4407f6d" containerID="b240ca866a87f3cd0402e655a7e49e66d1ffca9c1fcb45d8dbb004fa50200780" exitCode=0
Apr 16 18:17:41.790436 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.790295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64mfc" event={"ID":"711c83e1-762b-4a01-8f25-65c6c4407f6d","Type":"ContainerDied","Data":"b240ca866a87f3cd0402e655a7e49e66d1ffca9c1fcb45d8dbb004fa50200780"}
Apr 16 18:17:41.791323 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:41.791302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rbqgb" event={"ID":"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5","Type":"ContainerStarted","Data":"8aeaa5cbfbeadd006f4f0e2a79d8d65ceedd7698af1be3146ff0404a7074c177"}
Apr 16 18:17:42.796279 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:42.796247 2572 generic.go:358] "Generic (PLEG): container finished" podID="711c83e1-762b-4a01-8f25-65c6c4407f6d" containerID="a7152f60745e83b4ec2961968151ccddca84052baac773a6254de6b978363c17" exitCode=0
Apr 16 18:17:42.796691 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:42.796315 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64mfc" event={"ID":"711c83e1-762b-4a01-8f25-65c6c4407f6d","Type":"ContainerDied","Data":"a7152f60745e83b4ec2961968151ccddca84052baac773a6254de6b978363c17"}
Apr 16 18:17:43.183941 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:43.183753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz"
Apr 16 18:17:43.184122 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:43.183924 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:43.184122 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:43.184019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd"
Apr 16 18:17:43.184122 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:43.184094 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls podName:528ce5b9-64df-485a-8269-fb5ba8dc8ba5 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:47.18407217 +0000 UTC m=+40.209114947 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls") pod "dns-default-dj5sz" (UID: "528ce5b9-64df-485a-8269-fb5ba8dc8ba5") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:43.184122 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:43.184116 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:43.184296 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:43.184175 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert podName:5226006a-a858-4a19-a1d5-544f65d3a882 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:47.18414818 +0000 UTC m=+40.209190975 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert") pod "ingress-canary-9c9hd" (UID: "5226006a-a858-4a19-a1d5-544f65d3a882") : secret "canary-serving-cert" not found
Apr 16 18:17:43.801470 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:43.801437 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64mfc" event={"ID":"711c83e1-762b-4a01-8f25-65c6c4407f6d","Type":"ContainerStarted","Data":"0f39ccabd1d0bc8ea489c5dd191fd0f3ba4744cc7df399208777402b8bfd6431"}
Apr 16 18:17:43.827596 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:43.827541 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-64mfc" podStartSLOduration=6.349561209 podStartE2EDuration="36.827525444s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:10.056346396 +0000 UTC m=+3.081389185" lastFinishedPulling="2026-04-16 18:17:40.534310644 +0000 UTC m=+33.559353420" observedRunningTime="2026-04-16 18:17:43.826334913 +0000 UTC m=+36.851377773" watchObservedRunningTime="2026-04-16 18:17:43.827525444 +0000 UTC m=+36.852568242"
Apr 16 18:17:44.804514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:44.804480 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rbqgb" event={"ID":"f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5","Type":"ContainerStarted","Data":"a84320e39aafb37c58e91de698332ce62ca8f69541a0dcd762525172e383b4e0"}
Apr 16 18:17:44.804944 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:44.804921 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:17:44.820889 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:44.820839 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rbqgb" podStartSLOduration=34.891729374 podStartE2EDuration="37.820824632s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:41.665690588 +0000 UTC m=+34.690733378" lastFinishedPulling="2026-04-16 18:17:44.59478586 +0000 UTC m=+37.619828636" observedRunningTime="2026-04-16 18:17:44.819902309 +0000 UTC m=+37.844945107" watchObservedRunningTime="2026-04-16 18:17:44.820824632 +0000 UTC m=+37.845867432"
Apr 16 18:17:47.212777 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:47.212744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd"
Apr 16 18:17:47.213131 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:47.212786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz"
Apr 16 18:17:47.213131 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:47.212882 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:47.213131 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:47.212937 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert podName:5226006a-a858-4a19-a1d5-544f65d3a882 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:55.212922773 +0000 UTC m=+48.237965550 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert") pod "ingress-canary-9c9hd" (UID: "5226006a-a858-4a19-a1d5-544f65d3a882") : secret "canary-serving-cert" not found
Apr 16 18:17:47.213131 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:47.212944 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:47.213131 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:47.212979 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls podName:528ce5b9-64df-485a-8269-fb5ba8dc8ba5 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:55.212969094 +0000 UTC m=+48.238011870 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls") pod "dns-default-dj5sz" (UID: "528ce5b9-64df-485a-8269-fb5ba8dc8ba5") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:53.654845 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:53.654809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:53.666523 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:53.666491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd9794f-105a-4fbf-ae0d-7f399cb33595-original-pull-secret\") pod \"global-pull-secret-syncer-wh7d5\" (UID: \"8cd9794f-105a-4fbf-ae0d-7f399cb33595\") " pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:53.695463 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:53.695442 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wh7d5"
Apr 16 18:17:53.828970 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:53.828946 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wh7d5"]
Apr 16 18:17:53.832495 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:17:53.832467 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd9794f_105a_4fbf_ae0d_7f399cb33595.slice/crio-30402ad1a2aa9d8510a3534650a0db5b9babe122fa5043f334291e1f71a4d097 WatchSource:0}: Error finding container 30402ad1a2aa9d8510a3534650a0db5b9babe122fa5043f334291e1f71a4d097: Status 404 returned error can't find the container with id 30402ad1a2aa9d8510a3534650a0db5b9babe122fa5043f334291e1f71a4d097
Apr 16 18:17:54.823795 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:54.823761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wh7d5" event={"ID":"8cd9794f-105a-4fbf-ae0d-7f399cb33595","Type":"ContainerStarted","Data":"30402ad1a2aa9d8510a3534650a0db5b9babe122fa5043f334291e1f71a4d097"}
Apr 16 18:17:55.268721 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:55.268690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz"
Apr 16 18:17:55.268902 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:55.268757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd"
Apr 16 18:17:55.268902 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:55.268862 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:55.268902 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:55.268877 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:55.269058 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:55.268938 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls podName:528ce5b9-64df-485a-8269-fb5ba8dc8ba5 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:11.268918532 +0000 UTC m=+64.293961323 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls") pod "dns-default-dj5sz" (UID: "528ce5b9-64df-485a-8269-fb5ba8dc8ba5") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:55.269058 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:17:55.268956 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert podName:5226006a-a858-4a19-a1d5-544f65d3a882 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:11.268947644 +0000 UTC m=+64.293990424 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert") pod "ingress-canary-9c9hd" (UID: "5226006a-a858-4a19-a1d5-544f65d3a882") : secret "canary-serving-cert" not found
Apr 16 18:17:58.832267 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:58.832230 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wh7d5" event={"ID":"8cd9794f-105a-4fbf-ae0d-7f399cb33595","Type":"ContainerStarted","Data":"3b3f9144226a4497fee794127b2e0e3f9c86a82b6b5382b23e89e67625d904ce"}
Apr 16 18:17:58.851645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:17:58.851599 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wh7d5" podStartSLOduration=33.955023199 podStartE2EDuration="37.851587121s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:17:53.834515133 +0000 UTC m=+46.859557909" lastFinishedPulling="2026-04-16 18:17:57.731079041 +0000 UTC m=+50.756121831" observedRunningTime="2026-04-16 18:17:58.850107016 +0000 UTC m=+51.875149811" watchObservedRunningTime="2026-04-16 18:17:58.851587121 +0000 UTC m=+51.876629954"
Apr 16 18:18:05.783689 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:18:05.783659 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8vpdh"
Apr 16 18:18:11.272109 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:18:11.272074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz"
Apr 16 18:18:11.272470 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:18:11.272128 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd"
Apr 16 18:18:11.272470 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:11.272212 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:11.272470 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:11.272215 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:11.272470 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:11.272262 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert podName:5226006a-a858-4a19-a1d5-544f65d3a882 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:43.272248432 +0000 UTC m=+96.297291208 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert") pod "ingress-canary-9c9hd" (UID: "5226006a-a858-4a19-a1d5-544f65d3a882") : secret "canary-serving-cert" not found
Apr 16 18:18:11.272470 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:11.272273 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls podName:528ce5b9-64df-485a-8269-fb5ba8dc8ba5 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:43.272267694 +0000 UTC m=+96.297310470 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls") pod "dns-default-dj5sz" (UID: "528ce5b9-64df-485a-8269-fb5ba8dc8ba5") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:13.185377 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:18:13.185343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:18:13.185817 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:13.185442 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:18:13.185817 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:13.185493 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs podName:8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:17.185480188 +0000 UTC m=+130.210522964 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs") pod "network-metrics-daemon-kn4cv" (UID: "8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7") : secret "metrics-daemon-secret" not found
Apr 16 18:18:16.809795 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:18:16.809770 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rbqgb"
Apr 16 18:18:43.273436 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:18:43.273403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd"
Apr 16 18:18:43.273791 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:18:43.273448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz"
Apr 16 18:18:43.273791 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:43.273552 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:43.273791 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:43.273618 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert podName:5226006a-a858-4a19-a1d5-544f65d3a882 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:47.273603051 +0000 UTC m=+160.298645831 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert") pod "ingress-canary-9c9hd" (UID: "5226006a-a858-4a19-a1d5-544f65d3a882") : secret "canary-serving-cert" not found
Apr 16 18:18:43.273791 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:43.273555 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:43.273791 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:18:43.273720 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls podName:528ce5b9-64df-485a-8269-fb5ba8dc8ba5 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:47.273705981 +0000 UTC m=+160.298748757 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls") pod "dns-default-dj5sz" (UID: "528ce5b9-64df-485a-8269-fb5ba8dc8ba5") : secret "dns-default-metrics-tls" not found
Apr 16 18:19:17.189241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:17.189188 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv"
Apr 16 18:19:17.189745 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:17.189367 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:19:17.189745 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:17.189452 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs podName:8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:19.18943173 +0000 UTC m=+252.214474516 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs") pod "network-metrics-daemon-kn4cv" (UID: "8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7") : secret "metrics-daemon-secret" not found
Apr 16 18:19:42.454804 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:42.454763 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dj5sz" podUID="528ce5b9-64df-485a-8269-fb5ba8dc8ba5"
Apr 16 18:19:42.466906 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:42.466884 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9c9hd" podUID="5226006a-a858-4a19-a1d5-544f65d3a882"
Apr 16 18:19:42.588905 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:42.588874 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-kn4cv" podUID="8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7"
Apr 16 18:19:43.017484 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:43.017454 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dj5sz"
Apr 16 18:19:47.281521 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.281481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz"
Apr 16 18:19:47.281948 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.281544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd"
Apr 16 18:19:47.281948 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:47.281638 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:19:47.281948 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:47.281715 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls podName:528ce5b9-64df-485a-8269-fb5ba8dc8ba5 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:49.281699663 +0000 UTC m=+282.306742444 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls") pod "dns-default-dj5sz" (UID: "528ce5b9-64df-485a-8269-fb5ba8dc8ba5") : secret "dns-default-metrics-tls" not found
Apr 16 18:19:47.281948 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:47.281638 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:19:47.281948 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:47.281801 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert podName:5226006a-a858-4a19-a1d5-544f65d3a882 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:49.281788872 +0000 UTC m=+282.306831648 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert") pod "ingress-canary-9c9hd" (UID: "5226006a-a858-4a19-a1d5-544f65d3a882") : secret "canary-serving-cert" not found
Apr 16 18:19:47.649511 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.649438 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xmqgp"]
Apr 16 18:19:47.651256 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.651236 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp"
Apr 16 18:19:47.652029 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.651984 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5fc65bbfd4-n74q6"]
Apr 16 18:19:47.653262 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.653239 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 18:19:47.653629 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.653608 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 18:19:47.653757 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.653740 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:19:47.653935 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.653917 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5fc65bbfd4-n74q6"
Apr 16 18:19:47.653935 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.653929 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2tmbw\""
Apr 16 18:19:47.654095 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.653933 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:19:47.656087 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.656065 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-q4l8n\""
Apr 16 18:19:47.656153 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.656094 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 18:19:47.656259 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.656246 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 18:19:47.656370 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.656353 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 18:19:47.656469 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.656450 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 18:19:47.656584 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.656570 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 18:19:47.657231 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.657208 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 18:19:47.661357 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.661334 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xmqgp"]
Apr 16 18:19:47.661929 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.661909 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 18:19:47.668723 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.668703 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5fc65bbfd4-n74q6"]
Apr 16 18:19:47.761915 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.761889 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67cbdb9f8c-86drx"]
Apr 16 18:19:47.763622 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.763607 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx"
Apr 16 18:19:47.769528 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:47.769506 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"image-registry-tls\" is forbidden: User \"system:node:ip-10-0-138-175.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-138-175.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" type="*v1.Secret"
Apr 16 18:19:47.769716 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.769699 2572 status_manager.go:895] "Failed to get status for pod" podUID="5f9ca5b0-c6cc-419a-9f78-6f906e41d004" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" err="pods \"image-registry-67cbdb9f8c-86drx\" is forbidden: User \"system:node:ip-10-0-138-175.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-138-175.ec2.internal' and this object"
Apr 16 18:19:47.770013 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:47.769973 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"registry-dockercfg-59g6h\" is forbidden: User \"system:node:ip-10-0-138-175.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-138-175.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-59g6h\"" type="*v1.Secret"
Apr 16 18:19:47.770459 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:47.770442 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"image-registry-private-configuration\" is forbidden: User \"system:node:ip-10-0-138-175.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-138-175.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" type="*v1.Secret"
Apr 16 18:19:47.773381 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.773365 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 18:19:47.785030 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785008 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-tmp\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp"
Apr 16 18:19:47.785119 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-stats-auth\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6"
Apr 16 18:19:47.785119 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785056 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5hm\" (UniqueName: \"kubernetes.io/projected/49cc1dab-4e51-4560-8096-2b481c666fa4-kube-api-access-lt5hm\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6"
Apr 16 18:19:47.785119 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785105 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp"
Apr 16 18:19:47.785230 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785184 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-snapshots\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp"
Apr 16 18:19:47.785230 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7p8x\" (UniqueName: \"kubernetes.io/projected/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-kube-api-access-n7p8x\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp"
Apr 16 18:19:47.785289 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785263 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6"
Apr 16 18:19:47.785320 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp"
Apr 16 18:19:47.785352 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-serving-cert\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp"
Apr 16 18:19:47.785352 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6"
Apr 16 18:19:47.785437 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.785368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-default-certificate\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6"
Apr 16 18:19:47.790919 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.790898 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 18:19:47.795039 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.795021 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67cbdb9f8c-86drx"]
Apr 16 18:19:47.886053 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886033 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp"
Apr 16 18:19:47.886152 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886062 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-snapshots\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp"
Apr 16 18:19:47.886152 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7p8x\" (UniqueName: \"kubernetes.io/projected/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-kube-api-access-n7p8x\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID:
\"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.886152 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-image-registry-private-configuration\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.886296 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-certificates\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.886340 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-trusted-ca\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.886385 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886342 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.886385 ip-10-0-138-175 kubenswrapper[2572]: 
I0416 18:19:47.886359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-installation-pull-secrets\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.886385 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-bound-sa-token\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.886526 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886398 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-ca-trust-extracted\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.886526 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkmtr\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-kube-api-access-wkmtr\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.886526 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:47.886526 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.886526 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-serving-cert\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.886745 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:47.886745 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-default-certificate\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:47.886745 ip-10-0-138-175 kubenswrapper[2572]: 
E0416 18:19:47.886582 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:19:47.886745 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-tmp\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.886745 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:47.886653 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:48.38663639 +0000 UTC m=+161.411679173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : configmap references non-existent config key: service-ca.crt Apr 16 18:19:47.886745 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-stats-auth\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:47.886745 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5hm\" (UniqueName: \"kubernetes.io/projected/49cc1dab-4e51-4560-8096-2b481c666fa4-kube-api-access-lt5hm\") pod 
\"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:47.886745 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-snapshots\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.887065 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:47.886758 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:48.38674028 +0000 UTC m=+161.411783065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : secret "router-metrics-certs-default" not found Apr 16 18:19:47.887065 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.886769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-tmp\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.887065 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.887052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.887171 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.887084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.888854 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.888831 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-serving-cert\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.889443 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.889423 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-default-certificate\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:47.889510 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.889433 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-stats-auth\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:47.903643 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.903594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5hm\" (UniqueName: 
\"kubernetes.io/projected/49cc1dab-4e51-4560-8096-2b481c666fa4-kube-api-access-lt5hm\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:47.903755 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.903738 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7p8x\" (UniqueName: \"kubernetes.io/projected/1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d-kube-api-access-n7p8x\") pod \"insights-operator-5785d4fcdd-xmqgp\" (UID: \"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.961678 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.961656 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" Apr 16 18:19:47.987537 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.987517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-image-registry-private-configuration\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.987640 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.987544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-certificates\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.987640 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.987559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-trusted-ca\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.987640 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.987574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.987640 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.987591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-installation-pull-secrets\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.987640 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.987611 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-bound-sa-token\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.987640 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.987637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-ca-trust-extracted\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.987926 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.987660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkmtr\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-kube-api-access-wkmtr\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.988064 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.988047 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-ca-trust-extracted\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.988451 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.988429 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-certificates\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.988573 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.988559 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-trusted-ca\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.989610 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.989594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-installation-pull-secrets\") pod 
\"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.996526 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.996502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-bound-sa-token\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:47.996763 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:47.996749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkmtr\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-kube-api-access-wkmtr\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:48.071509 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:48.071484 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xmqgp"] Apr 16 18:19:48.076431 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:19:48.076407 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c91bbcf_fe4a_4f0d_b7c2_41f6d8de652d.slice/crio-e217a827ec2ba301f708d0197277472fd0279b8f2bb83c63ce7456ef39863979 WatchSource:0}: Error finding container e217a827ec2ba301f708d0197277472fd0279b8f2bb83c63ce7456ef39863979: Status 404 returned error can't find the container with id e217a827ec2ba301f708d0197277472fd0279b8f2bb83c63ce7456ef39863979 Apr 16 18:19:48.391681 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:48.391650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:48.392037 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:48.391695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:48.392037 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:48.391794 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:19:48.392037 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:48.391828 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:49.391814225 +0000 UTC m=+162.416857001 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : configmap references non-existent config key: service-ca.crt Apr 16 18:19:48.392037 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:48.391852 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:49.391835788 +0000 UTC m=+162.416878564 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : secret "router-metrics-certs-default" not found Apr 16 18:19:48.756895 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:48.756872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:19:48.760049 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:48.760024 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-image-registry-private-configuration\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:48.797625 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:48.797605 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:19:48.797798 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:48.797786 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:19:48.797862 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:48.797800 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbdb9f8c-86drx: secret "image-registry-tls" not found Apr 16 18:19:48.797862 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:48.797839 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls podName:5f9ca5b0-c6cc-419a-9f78-6f906e41d004 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:19:49.29782644 +0000 UTC m=+162.322869216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls") pod "image-registry-67cbdb9f8c-86drx" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004") : secret "image-registry-tls" not found Apr 16 18:19:49.032072 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:49.032007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" event={"ID":"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d","Type":"ContainerStarted","Data":"e217a827ec2ba301f708d0197277472fd0279b8f2bb83c63ce7456ef39863979"} Apr 16 18:19:49.298320 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:49.298228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:49.298492 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:49.298380 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:19:49.298492 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:49.298395 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbdb9f8c-86drx: secret "image-registry-tls" not found Apr 16 18:19:49.298492 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:49.298455 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls podName:5f9ca5b0-c6cc-419a-9f78-6f906e41d004 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:19:50.298440462 +0000 UTC m=+163.323483241 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls") pod "image-registry-67cbdb9f8c-86drx" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004") : secret "image-registry-tls" not found Apr 16 18:19:49.351708 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:49.351681 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-59g6h\"" Apr 16 18:19:49.399260 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:49.399228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:49.399598 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:49.399283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:49.399598 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:49.399376 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:19:49.399598 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:49.399418 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:19:51.399399788 +0000 UTC m=+164.424442565 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : configmap references non-existent config key: service-ca.crt Apr 16 18:19:49.399598 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:49.399436 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:51.399427106 +0000 UTC m=+164.424469886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : secret "router-metrics-certs-default" not found Apr 16 18:19:50.035323 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:50.035258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" event={"ID":"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d","Type":"ContainerStarted","Data":"b3706dd42b075a7aa995f4fb0674a3565b2c5e864b7eb2912d080ee35e093ef7"} Apr 16 18:19:50.307414 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:50.307352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:50.307541 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:50.307445 2572 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:19:50.307541 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:50.307459 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbdb9f8c-86drx: secret "image-registry-tls" not found Apr 16 18:19:50.307541 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:50.307513 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls podName:5f9ca5b0-c6cc-419a-9f78-6f906e41d004 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:52.307496241 +0000 UTC m=+165.332539022 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls") pod "image-registry-67cbdb9f8c-86drx" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004") : secret "image-registry-tls" not found Apr 16 18:19:51.414592 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:51.414552 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:51.414592 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:51.414598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:51.415012 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:51.414695 2572 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:19:51.415012 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:51.414716 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:55.414701967 +0000 UTC m=+168.439744746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : configmap references non-existent config key: service-ca.crt Apr 16 18:19:51.415012 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:51.414749 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:55.414736399 +0000 UTC m=+168.439779175 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : secret "router-metrics-certs-default" not found Apr 16 18:19:52.320088 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:52.320046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:52.320260 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:52.320193 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:19:52.320260 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:52.320207 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbdb9f8c-86drx: secret "image-registry-tls" not found Apr 16 18:19:52.320260 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:52.320257 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls podName:5f9ca5b0-c6cc-419a-9f78-6f906e41d004 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:56.320241317 +0000 UTC m=+169.345284147 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls") pod "image-registry-67cbdb9f8c-86drx" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004") : secret "image-registry-tls" not found Apr 16 18:19:53.412867 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:53.412839 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xxr9z_dc04a45e-8316-4c0a-86a5-12c986bd0756/dns-node-resolver/0.log" Apr 16 18:19:53.578140 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:53.578110 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:19:54.412916 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:54.412888 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v22jh_ebe5bfb0-23f3-4c66-9cc1-2436ea624b37/node-ca/0.log" Apr 16 18:19:55.443242 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:55.443209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:55.443617 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:55.443251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:19:55.443617 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:55.443360 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret 
"router-metrics-certs-default" not found Apr 16 18:19:55.443617 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:55.443383 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:03.44336523 +0000 UTC m=+176.468408006 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : configmap references non-existent config key: service-ca.crt Apr 16 18:19:55.443617 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:55.443401 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs podName:49cc1dab-4e51-4560-8096-2b481c666fa4 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:03.443390686 +0000 UTC m=+176.468433476 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs") pod "router-default-5fc65bbfd4-n74q6" (UID: "49cc1dab-4e51-4560-8096-2b481c666fa4") : secret "router-metrics-certs-default" not found Apr 16 18:19:56.348458 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:56.348381 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:19:56.348613 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:56.348489 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:19:56.348613 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:56.348503 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbdb9f8c-86drx: secret "image-registry-tls" not found Apr 16 18:19:56.348613 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:19:56.348556 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls podName:5f9ca5b0-c6cc-419a-9f78-6f906e41d004 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:04.348539148 +0000 UTC m=+177.373581931 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls") pod "image-registry-67cbdb9f8c-86drx" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004") : secret "image-registry-tls" not found Apr 16 18:19:56.578209 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:19:56.578176 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:20:03.502633 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:03.502600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:20:03.503011 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:03.502642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:20:03.503298 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:03.503278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49cc1dab-4e51-4560-8096-2b481c666fa4-service-ca-bundle\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:20:03.504979 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:03.504940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49cc1dab-4e51-4560-8096-2b481c666fa4-metrics-certs\") pod \"router-default-5fc65bbfd4-n74q6\" (UID: \"49cc1dab-4e51-4560-8096-2b481c666fa4\") " pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:20:03.567626 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:03.567600 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:20:03.677788 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:03.677737 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" podStartSLOduration=14.986226330000001 podStartE2EDuration="16.677719846s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="2026-04-16 18:19:48.078080875 +0000 UTC m=+161.103123651" lastFinishedPulling="2026-04-16 18:19:49.769574391 +0000 UTC m=+162.794617167" observedRunningTime="2026-04-16 18:19:50.072768447 +0000 UTC m=+163.097811246" watchObservedRunningTime="2026-04-16 18:20:03.677719846 +0000 UTC m=+176.702762643" Apr 16 18:20:03.678014 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:03.677999 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5fc65bbfd4-n74q6"] Apr 16 18:20:03.681498 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:20:03.681473 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49cc1dab_4e51_4560_8096_2b481c666fa4.slice/crio-d5e67a7d07c4e594fcc55c667360c2bfdeb83d2fe5250b0b1eb36534cfd94ca2 WatchSource:0}: Error finding container d5e67a7d07c4e594fcc55c667360c2bfdeb83d2fe5250b0b1eb36534cfd94ca2: Status 404 returned error can't find the container with id d5e67a7d07c4e594fcc55c667360c2bfdeb83d2fe5250b0b1eb36534cfd94ca2 Apr 16 18:20:04.064189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:04.064154 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" event={"ID":"49cc1dab-4e51-4560-8096-2b481c666fa4","Type":"ContainerStarted","Data":"a51035110319ab2600106227bcc75c39edec24babf8fe72d1a5b60dbfd4732e6"} Apr 16 18:20:04.064189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:04.064188 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" event={"ID":"49cc1dab-4e51-4560-8096-2b481c666fa4","Type":"ContainerStarted","Data":"d5e67a7d07c4e594fcc55c667360c2bfdeb83d2fe5250b0b1eb36534cfd94ca2"} Apr 16 18:20:04.088493 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:04.088454 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" podStartSLOduration=17.088440207 podStartE2EDuration="17.088440207s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:04.087435433 +0000 UTC m=+177.112478232" watchObservedRunningTime="2026-04-16 18:20:04.088440207 +0000 UTC m=+177.113483034" Apr 16 18:20:04.409760 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:04.409697 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:20:04.411790 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:04.411768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") pod \"image-registry-67cbdb9f8c-86drx\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:20:04.568421 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:04.568399 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:20:04.570741 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:04.570722 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:20:04.574639 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:04.574622 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:20:04.690260 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:04.690183 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67cbdb9f8c-86drx"] Apr 16 18:20:04.692584 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:20:04.692555 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9ca5b0_c6cc_419a_9f78_6f906e41d004.slice/crio-1eaaceb8faff2cdbecda1a674cec46ec1936fdb8fd85c81837aff4e7bcbb5cdb WatchSource:0}: Error finding container 1eaaceb8faff2cdbecda1a674cec46ec1936fdb8fd85c81837aff4e7bcbb5cdb: Status 404 returned error can't find the container with id 1eaaceb8faff2cdbecda1a674cec46ec1936fdb8fd85c81837aff4e7bcbb5cdb Apr 16 18:20:05.068619 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:05.068582 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" event={"ID":"5f9ca5b0-c6cc-419a-9f78-6f906e41d004","Type":"ContainerStarted","Data":"27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2"} Apr 16 18:20:05.068619 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:05.068622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" event={"ID":"5f9ca5b0-c6cc-419a-9f78-6f906e41d004","Type":"ContainerStarted","Data":"1eaaceb8faff2cdbecda1a674cec46ec1936fdb8fd85c81837aff4e7bcbb5cdb"} Apr 16 18:20:05.068850 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:05.068809 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 
18:20:05.068850 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:05.068840 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:20:05.070034 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:05.070013 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5fc65bbfd4-n74q6" Apr 16 18:20:05.089069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:05.089032 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" podStartSLOduration=18.089018946 podStartE2EDuration="18.089018946s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:05.087973864 +0000 UTC m=+178.113016665" watchObservedRunningTime="2026-04-16 18:20:05.089018946 +0000 UTC m=+178.114061744" Apr 16 18:20:15.245474 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.245438 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67cbdb9f8c-86drx"] Apr 16 18:20:15.273006 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.272972 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rhp8h"] Apr 16 18:20:15.276459 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.276445 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.279545 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.279525 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hm4gx\"" Apr 16 18:20:15.279676 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.279640 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:20:15.280471 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.280452 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:20:15.297286 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.297267 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rhp8h"] Apr 16 18:20:15.346363 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.346342 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8d7db7-222bz"] Apr 16 18:20:15.348077 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.348064 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.364813 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.364797 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8d7db7-222bz"] Apr 16 18:20:15.382982 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.382961 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-crio-socket\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.383085 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.383003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zxk\" (UniqueName: \"kubernetes.io/projected/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-kube-api-access-m8zxk\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.383085 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.383035 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.383154 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.383125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-data-volume\") pod \"insights-runtime-extractor-rhp8h\" (UID: 
\"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.383186 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.383165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.484425 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484404 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.484519 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d96497b-7167-4484-9a0e-c7052ca624e7-installation-pull-secrets\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.484519 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-data-volume\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.484519 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484503 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d96497b-7167-4484-9a0e-c7052ca624e7-trusted-ca\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.484675 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484566 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.484675 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-crio-socket\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.484675 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d96497b-7167-4484-9a0e-c7052ca624e7-ca-trust-extracted\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.484823 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zxk\" (UniqueName: \"kubernetes.io/projected/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-kube-api-access-m8zxk\") pod \"insights-runtime-extractor-rhp8h\" (UID: 
\"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.484823 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d96497b-7167-4484-9a0e-c7052ca624e7-registry-tls\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.484823 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfr8h\" (UniqueName: \"kubernetes.io/projected/3d96497b-7167-4484-9a0e-c7052ca624e7-kube-api-access-cfr8h\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.484823 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-data-volume\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.484823 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-crio-socket\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.485047 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d96497b-7167-4484-9a0e-c7052ca624e7-bound-sa-token\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.485047 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d96497b-7167-4484-9a0e-c7052ca624e7-image-registry-private-configuration\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.485047 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484937 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d96497b-7167-4484-9a0e-c7052ca624e7-registry-certificates\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.485047 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.484966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.486802 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.486785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rhp8h\" (UID: 
\"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.493434 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.493415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zxk\" (UniqueName: \"kubernetes.io/projected/d1a9a0b8-ff7a-46a5-90b6-9cac55628412-kube-api-access-m8zxk\") pod \"insights-runtime-extractor-rhp8h\" (UID: \"d1a9a0b8-ff7a-46a5-90b6-9cac55628412\") " pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.584588 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.584570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rhp8h" Apr 16 18:20:15.586148 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.586131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d96497b-7167-4484-9a0e-c7052ca624e7-installation-pull-secrets\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.586204 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.586165 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d96497b-7167-4484-9a0e-c7052ca624e7-trusted-ca\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.586446 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.586426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d96497b-7167-4484-9a0e-c7052ca624e7-ca-trust-extracted\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " 
pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.586531 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.586467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d96497b-7167-4484-9a0e-c7052ca624e7-registry-tls\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.586531 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.586494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfr8h\" (UniqueName: \"kubernetes.io/projected/3d96497b-7167-4484-9a0e-c7052ca624e7-kube-api-access-cfr8h\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.586531 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.586528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d96497b-7167-4484-9a0e-c7052ca624e7-bound-sa-token\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.586666 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.586558 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d96497b-7167-4484-9a0e-c7052ca624e7-image-registry-private-configuration\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.586666 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.586584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/3d96497b-7167-4484-9a0e-c7052ca624e7-registry-certificates\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.586846 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.586820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d96497b-7167-4484-9a0e-c7052ca624e7-ca-trust-extracted\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.587095 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.587066 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d96497b-7167-4484-9a0e-c7052ca624e7-trusted-ca\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.587433 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.587401 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d96497b-7167-4484-9a0e-c7052ca624e7-registry-certificates\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.588818 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.588786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d96497b-7167-4484-9a0e-c7052ca624e7-installation-pull-secrets\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.589121 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.589101 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d96497b-7167-4484-9a0e-c7052ca624e7-image-registry-private-configuration\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.589266 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.589245 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d96497b-7167-4484-9a0e-c7052ca624e7-registry-tls\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.595872 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.595848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfr8h\" (UniqueName: \"kubernetes.io/projected/3d96497b-7167-4484-9a0e-c7052ca624e7-kube-api-access-cfr8h\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.596301 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.596279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d96497b-7167-4484-9a0e-c7052ca624e7-bound-sa-token\") pod \"image-registry-8d7db7-222bz\" (UID: \"3d96497b-7167-4484-9a0e-c7052ca624e7\") " pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.656333 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.656296 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:15.699306 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.699279 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rhp8h"] Apr 16 18:20:15.703274 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:20:15.703235 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a9a0b8_ff7a_46a5_90b6_9cac55628412.slice/crio-ffb21dcaa7751b85210097df75797a7b11d8e186b6500dee099e63c74b35c06c WatchSource:0}: Error finding container ffb21dcaa7751b85210097df75797a7b11d8e186b6500dee099e63c74b35c06c: Status 404 returned error can't find the container with id ffb21dcaa7751b85210097df75797a7b11d8e186b6500dee099e63c74b35c06c Apr 16 18:20:15.775925 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:15.775897 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8d7db7-222bz"] Apr 16 18:20:15.779031 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:20:15.779010 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d96497b_7167_4484_9a0e_c7052ca624e7.slice/crio-291e502a97bdab1b6e95f9646f0b9c86a5fe09430bdd5c0c51a06f68ffccf196 WatchSource:0}: Error finding container 291e502a97bdab1b6e95f9646f0b9c86a5fe09430bdd5c0c51a06f68ffccf196: Status 404 returned error can't find the container with id 291e502a97bdab1b6e95f9646f0b9c86a5fe09430bdd5c0c51a06f68ffccf196 Apr 16 18:20:16.099306 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:16.099235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rhp8h" event={"ID":"d1a9a0b8-ff7a-46a5-90b6-9cac55628412","Type":"ContainerStarted","Data":"39bd3806f424304b4b6c8e1e2a8caec4b531378bde77f7088fd92d682b2b95ed"} Apr 16 18:20:16.099306 ip-10-0-138-175 kubenswrapper[2572]: 
I0416 18:20:16.099270 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rhp8h" event={"ID":"d1a9a0b8-ff7a-46a5-90b6-9cac55628412","Type":"ContainerStarted","Data":"ffb21dcaa7751b85210097df75797a7b11d8e186b6500dee099e63c74b35c06c"} Apr 16 18:20:16.100496 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:16.100468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8d7db7-222bz" event={"ID":"3d96497b-7167-4484-9a0e-c7052ca624e7","Type":"ContainerStarted","Data":"c01a2a765ae0c5312a442049647fe272fc1bdd6ef43446af71a6e76d6e24ef6c"} Apr 16 18:20:16.100612 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:16.100501 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8d7db7-222bz" event={"ID":"3d96497b-7167-4484-9a0e-c7052ca624e7","Type":"ContainerStarted","Data":"291e502a97bdab1b6e95f9646f0b9c86a5fe09430bdd5c0c51a06f68ffccf196"} Apr 16 18:20:16.100612 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:16.100581 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:16.120879 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:16.120845 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8d7db7-222bz" podStartSLOduration=1.120832832 podStartE2EDuration="1.120832832s" podCreationTimestamp="2026-04-16 18:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:16.120137577 +0000 UTC m=+189.145180398" watchObservedRunningTime="2026-04-16 18:20:16.120832832 +0000 UTC m=+189.145875630" Apr 16 18:20:17.104649 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:17.104614 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rhp8h" 
event={"ID":"d1a9a0b8-ff7a-46a5-90b6-9cac55628412","Type":"ContainerStarted","Data":"8c7706b7bc491e971735cc7512f377b066cd88a3a402e63f6ed78f4793ef86d8"} Apr 16 18:20:18.112827 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:18.112759 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rhp8h" event={"ID":"d1a9a0b8-ff7a-46a5-90b6-9cac55628412","Type":"ContainerStarted","Data":"4b8f77d6a568d77276146368d2c960f006b2669d1eb9e219ee2a8c41bbf07a86"} Apr 16 18:20:18.139812 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:18.139758 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rhp8h" podStartSLOduration=1.061675411 podStartE2EDuration="3.139739806s" podCreationTimestamp="2026-04-16 18:20:15 +0000 UTC" firstStartedPulling="2026-04-16 18:20:15.755139056 +0000 UTC m=+188.780181833" lastFinishedPulling="2026-04-16 18:20:17.833203448 +0000 UTC m=+190.858246228" observedRunningTime="2026-04-16 18:20:18.139235787 +0000 UTC m=+191.164278626" watchObservedRunningTime="2026-04-16 18:20:18.139739806 +0000 UTC m=+191.164782606" Apr 16 18:20:25.250784 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:25.250745 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:20:30.413690 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.413659 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jk75h"] Apr 16 18:20:30.415708 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.415692 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.417978 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.417950 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k6cqr\"" Apr 16 18:20:30.417978 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.417966 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:20:30.418210 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.418032 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:20:30.418210 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.418032 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:20:30.418288 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.418226 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:20:30.418742 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.418728 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:20:30.418888 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.418872 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:20:30.484415 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.484393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-wtmp\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " 
pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.484514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.484436 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-tls\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.484514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.484459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-textfile\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.484514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.484495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e446bf9-2453-42ef-af7d-39fb7389ba58-metrics-client-ca\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.484608 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.484520 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2szmn\" (UniqueName: \"kubernetes.io/projected/7e446bf9-2453-42ef-af7d-39fb7389ba58-kube-api-access-2szmn\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.484608 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.484552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.484608 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.484588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e446bf9-2453-42ef-af7d-39fb7389ba58-sys\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.484608 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.484601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7e446bf9-2453-42ef-af7d-39fb7389ba58-root\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.484723 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.484628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-accelerators-collector-config\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.585784 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.585752 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-accelerators-collector-config\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.585888 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.585802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-wtmp\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.585888 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.585839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-tls\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.585888 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.585859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-textfile\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.585888 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.585874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e446bf9-2453-42ef-af7d-39fb7389ba58-metrics-client-ca\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586083 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.585897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2szmn\" (UniqueName: \"kubernetes.io/projected/7e446bf9-2453-42ef-af7d-39fb7389ba58-kube-api-access-2szmn\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " 
pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586083 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.585922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586083 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.585954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e446bf9-2453-42ef-af7d-39fb7389ba58-sys\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586083 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.585976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7e446bf9-2453-42ef-af7d-39fb7389ba58-root\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586083 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.586052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-wtmp\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586083 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.586054 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7e446bf9-2453-42ef-af7d-39fb7389ba58-root\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " 
pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586083 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.586052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e446bf9-2453-42ef-af7d-39fb7389ba58-sys\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586401 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.586222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-textfile\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586454 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.586397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-accelerators-collector-config\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.586454 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.586405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e446bf9-2453-42ef-af7d-39fb7389ba58-metrics-client-ca\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.588251 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.588224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-tls\") pod \"node-exporter-jk75h\" (UID: 
\"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.588356 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.588280 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e446bf9-2453-42ef-af7d-39fb7389ba58-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.596982 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.596964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2szmn\" (UniqueName: \"kubernetes.io/projected/7e446bf9-2453-42ef-af7d-39fb7389ba58-kube-api-access-2szmn\") pod \"node-exporter-jk75h\" (UID: \"7e446bf9-2453-42ef-af7d-39fb7389ba58\") " pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.723922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:30.723901 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jk75h" Apr 16 18:20:30.732404 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:20:30.732378 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e446bf9_2453_42ef_af7d_39fb7389ba58.slice/crio-a758921616c3d01158adc45f6221be215e24976f570bfb3a51410a980f211e41 WatchSource:0}: Error finding container a758921616c3d01158adc45f6221be215e24976f570bfb3a51410a980f211e41: Status 404 returned error can't find the container with id a758921616c3d01158adc45f6221be215e24976f570bfb3a51410a980f211e41 Apr 16 18:20:31.143751 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.143675 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jk75h" event={"ID":"7e446bf9-2453-42ef-af7d-39fb7389ba58","Type":"ContainerStarted","Data":"a758921616c3d01158adc45f6221be215e24976f570bfb3a51410a980f211e41"} Apr 16 18:20:31.480665 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.480636 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:20:31.482828 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.482805 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.485509 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.485486 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:20:31.485778 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.485758 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-659xp\"" Apr 16 18:20:31.485857 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.485798 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:20:31.485918 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.485798 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:20:31.485918 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.485803 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:20:31.486064 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.486052 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:20:31.486173 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.486145 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:20:31.486173 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.486150 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:20:31.486278 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.486172 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:20:31.486504 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.486482 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:20:31.497240 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.497220 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:20:31.592509 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.592629 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592516 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.592629 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592563 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.592629 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592585 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-web-config\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.592629 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592612 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-config-volume\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.592790 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.592790 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592672 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm85l\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-kube-api-access-cm85l\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.592790 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
18:20:31.592790 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.592790 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592781 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.593051 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-config-out\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.593051 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.593051 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.592977 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.693865 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.693846 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.693948 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.693876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.693948 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.693908 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.693948 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.693931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694110 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.693971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694110 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.694012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-web-config\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694110 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.694054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-config-volume\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694110 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.694096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694289 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.694122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm85l\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-kube-api-access-cm85l\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694396 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:20:31.694346 2572 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 18:20:31.694396 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.694360 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694517 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:20:31.694403 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-main-tls podName:e172067d-0626-43f2-a977-f587273b6e98 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:32.194384697 +0000 UTC m=+205.219427488 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "e172067d-0626-43f2-a977-f587273b6e98") : secret "alertmanager-main-tls" not found Apr 16 18:20:31.694517 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.694428 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694517 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.694459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694517 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.694508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.694723 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.694534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-config-out\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.695024 ip-10-0-138-175 kubenswrapper[2572]: I0416 
18:20:31.694788 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.695209 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.695178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.696712 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.696648 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.696804 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.696739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-web-config\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.696863 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.696826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-config-out\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.697131 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.697110 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.697220 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.697165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-config-volume\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.697723 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.697695 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.697844 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.697825 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.697914 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.697828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:31.706443 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:31.706425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm85l\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-kube-api-access-cm85l\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:32.148049 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:32.148014 2572 generic.go:358] "Generic (PLEG): container finished" podID="7e446bf9-2453-42ef-af7d-39fb7389ba58" containerID="e74af3838101fd60c8704234c9934638db565fe8b1904ba6f3c025f712966ecc" exitCode=0 Apr 16 18:20:32.148204 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:32.148107 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jk75h" event={"ID":"7e446bf9-2453-42ef-af7d-39fb7389ba58","Type":"ContainerDied","Data":"e74af3838101fd60c8704234c9934638db565fe8b1904ba6f3c025f712966ecc"} Apr 16 18:20:32.197347 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:32.197318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:32.199312 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:32.199293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:32.392427 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:32.392403 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:32.523543 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:32.523519 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:20:32.525061 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:20:32.525035 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode172067d_0626_43f2_a977_f587273b6e98.slice/crio-350c85e888b4520d9c1c9f7b88f261a8b47c44483101af795f043438811d9b41 WatchSource:0}: Error finding container 350c85e888b4520d9c1c9f7b88f261a8b47c44483101af795f043438811d9b41: Status 404 returned error can't find the container with id 350c85e888b4520d9c1c9f7b88f261a8b47c44483101af795f043438811d9b41 Apr 16 18:20:33.153067 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:33.152956 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jk75h" event={"ID":"7e446bf9-2453-42ef-af7d-39fb7389ba58","Type":"ContainerStarted","Data":"a5ecdedad11075b1a3b5d920496be5c32f28d523085d5a61c4ef92392193d0ae"} Apr 16 18:20:33.153067 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:33.153014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jk75h" event={"ID":"7e446bf9-2453-42ef-af7d-39fb7389ba58","Type":"ContainerStarted","Data":"6def3b4eb63087d8651b80beb930ee8e5451e6c434279a04e59fae693a4e132b"} Apr 16 18:20:33.154034 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:33.154001 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerStarted","Data":"350c85e888b4520d9c1c9f7b88f261a8b47c44483101af795f043438811d9b41"} Apr 16 18:20:33.175072 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:33.175034 2572 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/node-exporter-jk75h" podStartSLOduration=2.415251552 podStartE2EDuration="3.175022897s" podCreationTimestamp="2026-04-16 18:20:30 +0000 UTC" firstStartedPulling="2026-04-16 18:20:30.734489673 +0000 UTC m=+203.759532452" lastFinishedPulling="2026-04-16 18:20:31.494261006 +0000 UTC m=+204.519303797" observedRunningTime="2026-04-16 18:20:33.1738311 +0000 UTC m=+206.198873897" watchObservedRunningTime="2026-04-16 18:20:33.175022897 +0000 UTC m=+206.200065695" Apr 16 18:20:34.158870 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:34.158833 2572 generic.go:358] "Generic (PLEG): container finished" podID="e172067d-0626-43f2-a977-f587273b6e98" containerID="cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41" exitCode=0 Apr 16 18:20:34.159294 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:34.158924 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerDied","Data":"cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41"} Apr 16 18:20:36.167922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:36.167844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerStarted","Data":"264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418"} Apr 16 18:20:36.167922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:36.167880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerStarted","Data":"7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636"} Apr 16 18:20:36.167922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:36.167900 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerStarted","Data":"fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0"} Apr 16 18:20:36.167922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:36.167910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerStarted","Data":"d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a"} Apr 16 18:20:36.167922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:36.167919 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerStarted","Data":"eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70"} Apr 16 18:20:37.108472 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:37.108444 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8d7db7-222bz" Apr 16 18:20:37.173403 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:37.173371 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerStarted","Data":"1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff"} Apr 16 18:20:37.203241 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:37.203200 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.394097523 podStartE2EDuration="6.203188031s" podCreationTimestamp="2026-04-16 18:20:31 +0000 UTC" firstStartedPulling="2026-04-16 18:20:32.526847432 +0000 UTC m=+205.551890212" lastFinishedPulling="2026-04-16 18:20:36.33593793 +0000 UTC m=+209.360980720" observedRunningTime="2026-04-16 18:20:37.203114133 +0000 UTC m=+210.228156944" watchObservedRunningTime="2026-04-16 18:20:37.203188031 +0000 UTC 
m=+210.228230871" Apr 16 18:20:40.262938 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.262862 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" podUID="5f9ca5b0-c6cc-419a-9f78-6f906e41d004" containerName="registry" containerID="cri-o://27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2" gracePeriod=30 Apr 16 18:20:40.494406 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.494385 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:20:40.560429 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.560365 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-installation-pull-secrets\") pod \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " Apr 16 18:20:40.560429 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.560401 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-trusted-ca\") pod \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " Apr 16 18:20:40.560618 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.560433 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") pod \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " Apr 16 18:20:40.560618 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.560462 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-ca-trust-extracted\") pod \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " Apr 16 18:20:40.560618 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.560480 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-certificates\") pod \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " Apr 16 18:20:40.560618 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.560499 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkmtr\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-kube-api-access-wkmtr\") pod \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " Apr 16 18:20:40.560618 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.560519 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-image-registry-private-configuration\") pod \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " Apr 16 18:20:40.560618 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.560538 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-bound-sa-token\") pod \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\" (UID: \"5f9ca5b0-c6cc-419a-9f78-6f906e41d004\") " Apr 16 18:20:40.560869 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.560818 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-trusted-ca" (OuterVolumeSpecName: 
"trusted-ca") pod "5f9ca5b0-c6cc-419a-9f78-6f906e41d004" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:40.561263 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.561230 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5f9ca5b0-c6cc-419a-9f78-6f906e41d004" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:40.562864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.562819 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5f9ca5b0-c6cc-419a-9f78-6f906e41d004" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:40.562864 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.562835 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5f9ca5b0-c6cc-419a-9f78-6f906e41d004" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:40.563059 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.562925 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-kube-api-access-wkmtr" (OuterVolumeSpecName: "kube-api-access-wkmtr") pod "5f9ca5b0-c6cc-419a-9f78-6f906e41d004" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004"). InnerVolumeSpecName "kube-api-access-wkmtr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:40.563158 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.563136 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5f9ca5b0-c6cc-419a-9f78-6f906e41d004" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:40.563194 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.563164 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5f9ca5b0-c6cc-419a-9f78-6f906e41d004" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:40.569121 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.569099 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5f9ca5b0-c6cc-419a-9f78-6f906e41d004" (UID: "5f9ca5b0-c6cc-419a-9f78-6f906e41d004"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:40.661930 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.661909 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-tls\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:20:40.661930 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.661929 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-ca-trust-extracted\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:20:40.662060 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.661938 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-registry-certificates\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:20:40.662060 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.661949 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wkmtr\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-kube-api-access-wkmtr\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:20:40.662060 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.661958 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-image-registry-private-configuration\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:20:40.662060 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.661967 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-bound-sa-token\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath 
\"\"" Apr 16 18:20:40.662060 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.661976 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-installation-pull-secrets\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:20:40.662060 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:40.661984 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f9ca5b0-c6cc-419a-9f78-6f906e41d004-trusted-ca\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:20:41.185802 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.185769 2572 generic.go:358] "Generic (PLEG): container finished" podID="5f9ca5b0-c6cc-419a-9f78-6f906e41d004" containerID="27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2" exitCode=0 Apr 16 18:20:41.185914 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.185844 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" Apr 16 18:20:41.185914 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.185862 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" event={"ID":"5f9ca5b0-c6cc-419a-9f78-6f906e41d004","Type":"ContainerDied","Data":"27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2"} Apr 16 18:20:41.185914 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.185911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67cbdb9f8c-86drx" event={"ID":"5f9ca5b0-c6cc-419a-9f78-6f906e41d004","Type":"ContainerDied","Data":"1eaaceb8faff2cdbecda1a674cec46ec1936fdb8fd85c81837aff4e7bcbb5cdb"} Apr 16 18:20:41.186044 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.185930 2572 scope.go:117] "RemoveContainer" containerID="27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2" Apr 16 18:20:41.195789 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.195772 2572 scope.go:117] "RemoveContainer" containerID="27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2" Apr 16 18:20:41.196144 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:20:41.196120 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2\": container with ID starting with 27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2 not found: ID does not exist" containerID="27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2" Apr 16 18:20:41.196221 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.196155 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2"} err="failed to get container status 
\"27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2\": rpc error: code = NotFound desc = could not find container \"27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2\": container with ID starting with 27928ed25981d1d474c76f7d6fa6df74c82204d9cd519ac145fd15fd4a16a2c2 not found: ID does not exist" Apr 16 18:20:41.209048 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.209023 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67cbdb9f8c-86drx"] Apr 16 18:20:41.212190 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.212169 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67cbdb9f8c-86drx"] Apr 16 18:20:41.582274 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:41.582246 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9ca5b0-c6cc-419a-9f78-6f906e41d004" path="/var/lib/kubelet/pods/5f9ca5b0-c6cc-419a-9f78-6f906e41d004/volumes" Apr 16 18:20:44.476892 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.476845 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-rzb5c"] Apr 16 18:20:44.477417 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.477278 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f9ca5b0-c6cc-419a-9f78-6f906e41d004" containerName="registry" Apr 16 18:20:44.477417 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.477296 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9ca5b0-c6cc-419a-9f78-6f906e41d004" containerName="registry" Apr 16 18:20:44.477417 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.477402 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f9ca5b0-c6cc-419a-9f78-6f906e41d004" containerName="registry" Apr 16 18:20:44.480836 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.480813 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-rzb5c" Apr 16 18:20:44.483044 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.483014 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:20:44.483162 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.483144 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-v6lzp\"" Apr 16 18:20:44.483206 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.483159 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:20:44.489858 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.489834 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-rzb5c"] Apr 16 18:20:44.590127 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.590098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djjs6\" (UniqueName: \"kubernetes.io/projected/85a99918-c029-45a6-bd5c-b9ed19071944-kube-api-access-djjs6\") pod \"downloads-586b57c7b4-rzb5c\" (UID: \"85a99918-c029-45a6-bd5c-b9ed19071944\") " pod="openshift-console/downloads-586b57c7b4-rzb5c" Apr 16 18:20:44.690528 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.690503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djjs6\" (UniqueName: \"kubernetes.io/projected/85a99918-c029-45a6-bd5c-b9ed19071944-kube-api-access-djjs6\") pod \"downloads-586b57c7b4-rzb5c\" (UID: \"85a99918-c029-45a6-bd5c-b9ed19071944\") " pod="openshift-console/downloads-586b57c7b4-rzb5c" Apr 16 18:20:44.700231 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.700208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djjs6\" (UniqueName: 
\"kubernetes.io/projected/85a99918-c029-45a6-bd5c-b9ed19071944-kube-api-access-djjs6\") pod \"downloads-586b57c7b4-rzb5c\" (UID: \"85a99918-c029-45a6-bd5c-b9ed19071944\") " pod="openshift-console/downloads-586b57c7b4-rzb5c" Apr 16 18:20:44.790311 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.790250 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-rzb5c" Apr 16 18:20:44.933664 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:44.933635 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-rzb5c"] Apr 16 18:20:44.936613 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:20:44.936587 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85a99918_c029_45a6_bd5c_b9ed19071944.slice/crio-e4f152c644a0925c794dde90b9b5e4c182eca29fb8c2793874f1e47a7b4e21c0 WatchSource:0}: Error finding container e4f152c644a0925c794dde90b9b5e4c182eca29fb8c2793874f1e47a7b4e21c0: Status 404 returned error can't find the container with id e4f152c644a0925c794dde90b9b5e4c182eca29fb8c2793874f1e47a7b4e21c0 Apr 16 18:20:45.197790 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:45.197718 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-rzb5c" event={"ID":"85a99918-c029-45a6-bd5c-b9ed19071944","Type":"ContainerStarted","Data":"e4f152c644a0925c794dde90b9b5e4c182eca29fb8c2793874f1e47a7b4e21c0"} Apr 16 18:20:56.007590 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.007560 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5594bb858f-6pkv5"] Apr 16 18:20:56.010900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.010875 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.013114 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.013086 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:20:56.013247 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.013118 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:20:56.013247 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.013204 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:20:56.013793 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.013715 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:20:56.013793 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.013752 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:20:56.014001 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.013824 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8jw5b\"" Apr 16 18:20:56.023512 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.023493 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5594bb858f-6pkv5"] Apr 16 18:20:56.083298 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.083270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-oauth-serving-cert\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.083420 ip-10-0-138-175 kubenswrapper[2572]: 
I0416 18:20:56.083312 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-oauth-config\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.083420 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.083365 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-config\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.083512 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.083441 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-service-ca\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.083512 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.083480 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbb78\" (UniqueName: \"kubernetes.io/projected/9aba0206-3c26-423b-8a83-1375fe2dfd35-kube-api-access-zbb78\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.083610 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.083560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-serving-cert\") pod \"console-5594bb858f-6pkv5\" 
(UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.184720 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.184693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-oauth-serving-cert\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.184898 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.184733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-oauth-config\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.184898 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.184753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-config\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.184898 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.184787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-service-ca\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.184898 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.184815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbb78\" (UniqueName: 
\"kubernetes.io/projected/9aba0206-3c26-423b-8a83-1375fe2dfd35-kube-api-access-zbb78\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.185235 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.184981 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-serving-cert\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.185636 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.185613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-service-ca\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.185758 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.185676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-config\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.185820 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.185748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-oauth-serving-cert\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.187409 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.187384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-oauth-config\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.187730 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.187712 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-serving-cert\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.197339 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.197316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbb78\" (UniqueName: \"kubernetes.io/projected/9aba0206-3c26-423b-8a83-1375fe2dfd35-kube-api-access-zbb78\") pod \"console-5594bb858f-6pkv5\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:56.323412 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:56.323340 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:20:59.990592 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:20:59.990551 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5594bb858f-6pkv5"] Apr 16 18:20:59.994363 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:20:59.994333 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aba0206_3c26_423b_8a83_1375fe2dfd35.slice/crio-1d3df6f881d466fe40616cbc5ff5bf4180df6c8be1538896a45519eebc33c1b6 WatchSource:0}: Error finding container 1d3df6f881d466fe40616cbc5ff5bf4180df6c8be1538896a45519eebc33c1b6: Status 404 returned error can't find the container with id 1d3df6f881d466fe40616cbc5ff5bf4180df6c8be1538896a45519eebc33c1b6 Apr 16 18:21:00.245320 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:00.245272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-rzb5c" event={"ID":"85a99918-c029-45a6-bd5c-b9ed19071944","Type":"ContainerStarted","Data":"5199bfa34fec2f7b819a73d460ce29be933859b0c5495b51c9e0818c1baf259f"} Apr 16 18:21:00.245642 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:00.245623 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-rzb5c" Apr 16 18:21:00.246665 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:00.246639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5594bb858f-6pkv5" event={"ID":"9aba0206-3c26-423b-8a83-1375fe2dfd35","Type":"ContainerStarted","Data":"1d3df6f881d466fe40616cbc5ff5bf4180df6c8be1538896a45519eebc33c1b6"} Apr 16 18:21:00.264491 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:00.264444 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-rzb5c" podStartSLOduration=1.227923789 podStartE2EDuration="16.264429506s" 
podCreationTimestamp="2026-04-16 18:20:44 +0000 UTC" firstStartedPulling="2026-04-16 18:20:44.938419564 +0000 UTC m=+217.963462340" lastFinishedPulling="2026-04-16 18:20:59.974925272 +0000 UTC m=+232.999968057" observedRunningTime="2026-04-16 18:21:00.263297284 +0000 UTC m=+233.288340105" watchObservedRunningTime="2026-04-16 18:21:00.264429506 +0000 UTC m=+233.289472303" Apr 16 18:21:00.268961 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:00.268937 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-rzb5c" Apr 16 18:21:05.262891 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:05.262854 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5594bb858f-6pkv5" event={"ID":"9aba0206-3c26-423b-8a83-1375fe2dfd35","Type":"ContainerStarted","Data":"f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281"} Apr 16 18:21:05.287842 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:05.287798 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5594bb858f-6pkv5" podStartSLOduration=5.830775575 podStartE2EDuration="10.287785679s" podCreationTimestamp="2026-04-16 18:20:55 +0000 UTC" firstStartedPulling="2026-04-16 18:20:59.996038391 +0000 UTC m=+233.021081167" lastFinishedPulling="2026-04-16 18:21:04.453048486 +0000 UTC m=+237.478091271" observedRunningTime="2026-04-16 18:21:05.286935845 +0000 UTC m=+238.311978642" watchObservedRunningTime="2026-04-16 18:21:05.287785679 +0000 UTC m=+238.312828499" Apr 16 18:21:06.324054 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:06.324018 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:21:06.324054 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:06.324056 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 
18:21:06.329236 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:06.329212 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:21:07.274063 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:07.274031 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:21:15.418667 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:15.418629 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5594bb858f-6pkv5"] Apr 16 18:21:16.293533 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:16.293505 2572 generic.go:358] "Generic (PLEG): container finished" podID="1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d" containerID="b3706dd42b075a7aa995f4fb0674a3565b2c5e864b7eb2912d080ee35e093ef7" exitCode=0 Apr 16 18:21:16.293645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:16.293556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" event={"ID":"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d","Type":"ContainerDied","Data":"b3706dd42b075a7aa995f4fb0674a3565b2c5e864b7eb2912d080ee35e093ef7"} Apr 16 18:21:16.293851 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:16.293838 2572 scope.go:117] "RemoveContainer" containerID="b3706dd42b075a7aa995f4fb0674a3565b2c5e864b7eb2912d080ee35e093ef7" Apr 16 18:21:17.298072 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:17.298036 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xmqgp" event={"ID":"1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d","Type":"ContainerStarted","Data":"65f112517b6d74df4dd9e7e2c661ed36c0755e291cc95372689df2b47f17a04c"} Apr 16 18:21:17.409919 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:17.409896 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xxr9z_dc04a45e-8316-4c0a-86a5-12c986bd0756/dns-node-resolver/0.log" 
Apr 16 18:21:19.273481 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:19.273444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:21:19.275911 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:19.275886 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7-metrics-certs\") pod \"network-metrics-daemon-kn4cv\" (UID: \"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7\") " pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:21:19.381726 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:19.381699 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8x9mw\"" Apr 16 18:21:19.389979 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:19.389960 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kn4cv" Apr 16 18:21:19.506768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:19.506741 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kn4cv"] Apr 16 18:21:19.509819 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:21:19.509794 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a0f68c0_813b_43dc_ae6c_b7aff3eb0ee7.slice/crio-4f4fca46fb9f9a6463779d865b21bafe80373bb400dbe8975013aab8c60eef14 WatchSource:0}: Error finding container 4f4fca46fb9f9a6463779d865b21bafe80373bb400dbe8975013aab8c60eef14: Status 404 returned error can't find the container with id 4f4fca46fb9f9a6463779d865b21bafe80373bb400dbe8975013aab8c60eef14 Apr 16 18:21:20.306431 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:20.306390 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kn4cv" event={"ID":"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7","Type":"ContainerStarted","Data":"4f4fca46fb9f9a6463779d865b21bafe80373bb400dbe8975013aab8c60eef14"} Apr 16 18:21:21.310861 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:21.310821 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kn4cv" event={"ID":"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7","Type":"ContainerStarted","Data":"c9a409e4703fc0c1618f3569bffdff70869dad6b3beed80d7f6e9fea06d69f4b"} Apr 16 18:21:21.311279 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:21.310871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kn4cv" event={"ID":"8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7","Type":"ContainerStarted","Data":"96594dd7639304c1091383b37d83cf8bb2ac995efa509e244ab204ab4bd612cf"} Apr 16 18:21:21.329418 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:21.329299 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-kn4cv" podStartSLOduration=253.159047324 podStartE2EDuration="4m14.329279457s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:21:19.511593276 +0000 UTC m=+252.536636055" lastFinishedPulling="2026-04-16 18:21:20.681825395 +0000 UTC m=+253.706868188" observedRunningTime="2026-04-16 18:21:21.328467502 +0000 UTC m=+254.353510300" watchObservedRunningTime="2026-04-16 18:21:21.329279457 +0000 UTC m=+254.354322257" Apr 16 18:21:40.440707 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.440601 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5594bb858f-6pkv5" podUID="9aba0206-3c26-423b-8a83-1375fe2dfd35" containerName="console" containerID="cri-o://f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281" gracePeriod=15 Apr 16 18:21:40.695893 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.695848 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5594bb858f-6pkv5_9aba0206-3c26-423b-8a83-1375fe2dfd35/console/0.log" Apr 16 18:21:40.696009 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.695902 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:21:40.737229 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.737205 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-config\") pod \"9aba0206-3c26-423b-8a83-1375fe2dfd35\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " Apr 16 18:21:40.737327 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.737233 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-service-ca\") pod \"9aba0206-3c26-423b-8a83-1375fe2dfd35\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " Apr 16 18:21:40.737327 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.737282 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-oauth-config\") pod \"9aba0206-3c26-423b-8a83-1375fe2dfd35\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " Apr 16 18:21:40.737327 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.737299 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-oauth-serving-cert\") pod \"9aba0206-3c26-423b-8a83-1375fe2dfd35\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " Apr 16 18:21:40.737327 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.737322 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbb78\" (UniqueName: \"kubernetes.io/projected/9aba0206-3c26-423b-8a83-1375fe2dfd35-kube-api-access-zbb78\") pod \"9aba0206-3c26-423b-8a83-1375fe2dfd35\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " Apr 16 18:21:40.737485 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.737343 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-serving-cert\") pod \"9aba0206-3c26-423b-8a83-1375fe2dfd35\" (UID: \"9aba0206-3c26-423b-8a83-1375fe2dfd35\") " Apr 16 18:21:40.737637 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.737611 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-config" (OuterVolumeSpecName: "console-config") pod "9aba0206-3c26-423b-8a83-1375fe2dfd35" (UID: "9aba0206-3c26-423b-8a83-1375fe2dfd35"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:40.737731 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.737685 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9aba0206-3c26-423b-8a83-1375fe2dfd35" (UID: "9aba0206-3c26-423b-8a83-1375fe2dfd35"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:40.737731 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.737689 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-service-ca" (OuterVolumeSpecName: "service-ca") pod "9aba0206-3c26-423b-8a83-1375fe2dfd35" (UID: "9aba0206-3c26-423b-8a83-1375fe2dfd35"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:40.739398 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.739372 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9aba0206-3c26-423b-8a83-1375fe2dfd35" (UID: "9aba0206-3c26-423b-8a83-1375fe2dfd35"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:40.739506 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.739437 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9aba0206-3c26-423b-8a83-1375fe2dfd35" (UID: "9aba0206-3c26-423b-8a83-1375fe2dfd35"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:40.739506 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.739475 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aba0206-3c26-423b-8a83-1375fe2dfd35-kube-api-access-zbb78" (OuterVolumeSpecName: "kube-api-access-zbb78") pod "9aba0206-3c26-423b-8a83-1375fe2dfd35" (UID: "9aba0206-3c26-423b-8a83-1375fe2dfd35"). InnerVolumeSpecName "kube-api-access-zbb78". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:40.838808 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.838786 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-oauth-config\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:40.838808 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.838806 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-oauth-serving-cert\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:40.838922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.838817 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbb78\" (UniqueName: \"kubernetes.io/projected/9aba0206-3c26-423b-8a83-1375fe2dfd35-kube-api-access-zbb78\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:40.838922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.838826 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-serving-cert\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:40.838922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.838836 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-console-config\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:40.838922 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:40.838844 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aba0206-3c26-423b-8a83-1375fe2dfd35-service-ca\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:41.367175 ip-10-0-138-175 
kubenswrapper[2572]: I0416 18:21:41.367152 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5594bb858f-6pkv5_9aba0206-3c26-423b-8a83-1375fe2dfd35/console/0.log" Apr 16 18:21:41.367289 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.367192 2572 generic.go:358] "Generic (PLEG): container finished" podID="9aba0206-3c26-423b-8a83-1375fe2dfd35" containerID="f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281" exitCode=2 Apr 16 18:21:41.367289 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.367222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5594bb858f-6pkv5" event={"ID":"9aba0206-3c26-423b-8a83-1375fe2dfd35","Type":"ContainerDied","Data":"f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281"} Apr 16 18:21:41.367289 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.367246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5594bb858f-6pkv5" event={"ID":"9aba0206-3c26-423b-8a83-1375fe2dfd35","Type":"ContainerDied","Data":"1d3df6f881d466fe40616cbc5ff5bf4180df6c8be1538896a45519eebc33c1b6"} Apr 16 18:21:41.367289 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.367262 2572 scope.go:117] "RemoveContainer" containerID="f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281" Apr 16 18:21:41.367289 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.367271 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5594bb858f-6pkv5" Apr 16 18:21:41.375565 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.375547 2572 scope.go:117] "RemoveContainer" containerID="f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281" Apr 16 18:21:41.375854 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:21:41.375832 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281\": container with ID starting with f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281 not found: ID does not exist" containerID="f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281" Apr 16 18:21:41.375944 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.375860 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281"} err="failed to get container status \"f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281\": rpc error: code = NotFound desc = could not find container \"f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281\": container with ID starting with f52e18354c30e5d5d6b26a421c2248da4dae39c9ff20fd68a065888bdaf47281 not found: ID does not exist" Apr 16 18:21:41.388980 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.388957 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5594bb858f-6pkv5"] Apr 16 18:21:41.395732 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.395707 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5594bb858f-6pkv5"] Apr 16 18:21:41.582742 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:41.582717 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aba0206-3c26-423b-8a83-1375fe2dfd35" 
path="/var/lib/kubelet/pods/9aba0206-3c26-423b-8a83-1375fe2dfd35/volumes" Apr 16 18:21:46.018241 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:21:46.018189 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dj5sz" podUID="528ce5b9-64df-485a-8269-fb5ba8dc8ba5" Apr 16 18:21:46.383479 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:46.383393 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dj5sz" Apr 16 18:21:49.296615 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.296579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod \"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:21:49.297013 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.296640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:21:49.299246 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.299213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/528ce5b9-64df-485a-8269-fb5ba8dc8ba5-metrics-tls\") pod \"dns-default-dj5sz\" (UID: \"528ce5b9-64df-485a-8269-fb5ba8dc8ba5\") " pod="openshift-dns/dns-default-dj5sz" Apr 16 18:21:49.299359 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.299304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5226006a-a858-4a19-a1d5-544f65d3a882-cert\") pod 
\"ingress-canary-9c9hd\" (UID: \"5226006a-a858-4a19-a1d5-544f65d3a882\") " pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:21:49.381727 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.381698 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v2cgj\"" Apr 16 18:21:49.386859 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.386841 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m4v2s\"" Apr 16 18:21:49.389476 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.389459 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9c9hd" Apr 16 18:21:49.395588 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.395563 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dj5sz" Apr 16 18:21:49.508203 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.508174 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9c9hd"] Apr 16 18:21:49.511394 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:21:49.511366 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5226006a_a858_4a19_a1d5_544f65d3a882.slice/crio-1ab0f2acd2d8440ca43d097ac466071474af782dd06294374c9ec29ab81faf06 WatchSource:0}: Error finding container 1ab0f2acd2d8440ca43d097ac466071474af782dd06294374c9ec29ab81faf06: Status 404 returned error can't find the container with id 1ab0f2acd2d8440ca43d097ac466071474af782dd06294374c9ec29ab81faf06 Apr 16 18:21:49.527094 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:49.527070 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dj5sz"] Apr 16 18:21:49.530118 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:21:49.530098 2572 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod528ce5b9_64df_485a_8269_fb5ba8dc8ba5.slice/crio-b0cca9e27eae377ec78985b0014df30d9b7650dcf850e90843b6812cd56f0e19 WatchSource:0}: Error finding container b0cca9e27eae377ec78985b0014df30d9b7650dcf850e90843b6812cd56f0e19: Status 404 returned error can't find the container with id b0cca9e27eae377ec78985b0014df30d9b7650dcf850e90843b6812cd56f0e19 Apr 16 18:21:50.397708 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.397660 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dj5sz" event={"ID":"528ce5b9-64df-485a-8269-fb5ba8dc8ba5","Type":"ContainerStarted","Data":"b0cca9e27eae377ec78985b0014df30d9b7650dcf850e90843b6812cd56f0e19"} Apr 16 18:21:50.398866 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.398824 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9c9hd" event={"ID":"5226006a-a858-4a19-a1d5-544f65d3a882","Type":"ContainerStarted","Data":"1ab0f2acd2d8440ca43d097ac466071474af782dd06294374c9ec29ab81faf06"} Apr 16 18:21:50.837311 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.837278 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:21:50.838046 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.837786 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="alertmanager" containerID="cri-o://eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70" gracePeriod=120 Apr 16 18:21:50.838046 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.837863 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy-metric" 
containerID="cri-o://264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418" gracePeriod=120 Apr 16 18:21:50.838046 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.837892 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="prom-label-proxy" containerID="cri-o://1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff" gracePeriod=120 Apr 16 18:21:50.838046 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.837921 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="config-reloader" containerID="cri-o://d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a" gracePeriod=120 Apr 16 18:21:50.838046 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.837902 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy-web" containerID="cri-o://fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0" gracePeriod=120 Apr 16 18:21:50.838046 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.838011 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy" containerID="cri-o://7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636" gracePeriod=120 Apr 16 18:21:50.841283 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.841257 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66489c6f5f-sjrkg"] Apr 16 18:21:50.842638 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.842611 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9aba0206-3c26-423b-8a83-1375fe2dfd35" containerName="console" Apr 16 18:21:50.842638 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.842634 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aba0206-3c26-423b-8a83-1375fe2dfd35" containerName="console" Apr 16 18:21:50.842890 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.842742 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9aba0206-3c26-423b-8a83-1375fe2dfd35" containerName="console" Apr 16 18:21:50.846553 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.846534 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:50.849087 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.848883 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8jw5b\"" Apr 16 18:21:50.849939 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.849918 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:21:50.850052 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.850000 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:21:50.850357 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.850330 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:21:50.851286 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.851246 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:21:50.851525 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.851504 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:21:50.860106 ip-10-0-138-175 kubenswrapper[2572]: 
I0416 18:21:50.860059 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66489c6f5f-sjrkg"] Apr 16 18:21:50.864730 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.864702 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:21:50.907659 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.907634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnj29\" (UniqueName: \"kubernetes.io/projected/648f464e-2822-4169-8516-1f3382ca3ba0-kube-api-access-hnj29\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:50.907767 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.907675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-service-ca\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:50.907767 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.907720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-trusted-ca-bundle\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:50.907767 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.907737 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-oauth-config\") pod \"console-66489c6f5f-sjrkg\" (UID: 
\"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:50.907900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.907813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-console-config\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:50.907900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.907888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-serving-cert\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:50.907984 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:50.907911 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-oauth-serving-cert\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.008925 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.008885 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnj29\" (UniqueName: \"kubernetes.io/projected/648f464e-2822-4169-8516-1f3382ca3ba0-kube-api-access-hnj29\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.008925 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.008932 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-service-ca\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.009117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.008974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-trusted-ca-bundle\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.009117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.009012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-oauth-config\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.009117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.009073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-console-config\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.009263 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.009141 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-serving-cert\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.009263 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.009172 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-oauth-serving-cert\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.009862 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.009832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-oauth-serving-cert\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.009862 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.009857 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-console-config\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.010054 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.009913 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-trusted-ca-bundle\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.010114 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.010093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-service-ca\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.011744 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.011722 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-oauth-config\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.011946 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.011920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-serving-cert\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.017949 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.017926 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnj29\" (UniqueName: \"kubernetes.io/projected/648f464e-2822-4169-8516-1f3382ca3ba0-kube-api-access-hnj29\") pod \"console-66489c6f5f-sjrkg\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.167125 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.167050 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:21:51.404815 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404786 2572 generic.go:358] "Generic (PLEG): container finished" podID="e172067d-0626-43f2-a977-f587273b6e98" containerID="1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff" exitCode=0 Apr 16 18:21:51.405199 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404809 2572 generic.go:358] "Generic (PLEG): container finished" podID="e172067d-0626-43f2-a977-f587273b6e98" containerID="264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418" exitCode=0 Apr 16 18:21:51.405199 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404831 2572 generic.go:358] "Generic (PLEG): container finished" podID="e172067d-0626-43f2-a977-f587273b6e98" containerID="7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636" exitCode=0 Apr 16 18:21:51.405199 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404841 2572 generic.go:358] "Generic (PLEG): container finished" podID="e172067d-0626-43f2-a977-f587273b6e98" containerID="d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a" exitCode=0 Apr 16 18:21:51.405199 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404849 2572 generic.go:358] "Generic (PLEG): container finished" podID="e172067d-0626-43f2-a977-f587273b6e98" containerID="eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70" exitCode=0 Apr 16 18:21:51.405199 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404855 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerDied","Data":"1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff"} Apr 16 18:21:51.405199 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404884 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerDied","Data":"264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418"} Apr 16 18:21:51.405199 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404898 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerDied","Data":"7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636"} Apr 16 18:21:51.405199 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerDied","Data":"d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a"} Apr 16 18:21:51.405199 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.404922 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerDied","Data":"eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70"} Apr 16 18:21:51.406348 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.406325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9c9hd" event={"ID":"5226006a-a858-4a19-a1d5-544f65d3a882","Type":"ContainerStarted","Data":"682fc9fbe61b9b79d4727a98f859cc1577f90688f47c50a3e90df70763e7e55a"} Apr 16 18:21:51.425013 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.424927 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9c9hd" podStartSLOduration=250.643238327 podStartE2EDuration="4m12.424906915s" podCreationTimestamp="2026-04-16 18:17:39 +0000 UTC" firstStartedPulling="2026-04-16 18:21:49.513197247 +0000 UTC m=+282.538240025" lastFinishedPulling="2026-04-16 18:21:51.294865826 +0000 UTC m=+284.319908613" 
observedRunningTime="2026-04-16 18:21:51.424292321 +0000 UTC m=+284.449335120" watchObservedRunningTime="2026-04-16 18:21:51.424906915 +0000 UTC m=+284.449949795" Apr 16 18:21:51.430616 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:51.430588 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66489c6f5f-sjrkg"] Apr 16 18:21:51.432180 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:21:51.432157 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod648f464e_2822_4169_8516_1f3382ca3ba0.slice/crio-012cbda0c6b189df0e72b048a25f65f4640faa37c223db495de0c397b041c71b WatchSource:0}: Error finding container 012cbda0c6b189df0e72b048a25f65f4640faa37c223db495de0c397b041c71b: Status 404 returned error can't find the container with id 012cbda0c6b189df0e72b048a25f65f4640faa37c223db495de0c397b041c71b Apr 16 18:21:52.061082 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.061063 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.117744 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.117714 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm85l\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-kube-api-access-cm85l\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.117883 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.117748 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-trusted-ca-bundle\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.117883 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.117771 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-main-tls\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.117883 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.117802 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-cluster-tls-config\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.117883 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.117830 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-metric\") pod 
\"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.117883 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.117872 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-main-db\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.118177 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.117897 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.118177 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.117941 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-metrics-client-ca\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.118177 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.117971 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-tls-assets\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.118177 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.118022 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.118177 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.118048 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-config-volume\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.118177 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.118076 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-config-out\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.118177 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.118115 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-web-config\") pod \"e172067d-0626-43f2-a977-f587273b6e98\" (UID: \"e172067d-0626-43f2-a977-f587273b6e98\") " Apr 16 18:21:52.118177 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.118147 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:52.118525 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.118343 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.119420 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.118874 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:52.119420 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.119146 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:52.120924 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.120875 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-kube-api-access-cm85l" (OuterVolumeSpecName: "kube-api-access-cm85l") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "kube-api-access-cm85l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:52.120924 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.120874 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:52.121315 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.121268 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-config-out" (OuterVolumeSpecName: "config-out") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:52.121581 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.121554 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:52.121905 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.121840 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-config-volume" (OuterVolumeSpecName: "config-volume") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:52.122214 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.122190 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:52.122498 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.122480 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:52.122853 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.122838 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:52.124848 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.124818 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:52.130566 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.130547 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-web-config" (OuterVolumeSpecName: "web-config") pod "e172067d-0626-43f2-a977-f587273b6e98" (UID: "e172067d-0626-43f2-a977-f587273b6e98"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:52.219035 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219015 2572 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e172067d-0626-43f2-a977-f587273b6e98-metrics-client-ca\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219036 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-tls-assets\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219047 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 
kubenswrapper[2572]: I0416 18:21:52.219056 2572 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-config-volume\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219064 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-config-out\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219072 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-web-config\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219082 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cm85l\" (UniqueName: \"kubernetes.io/projected/e172067d-0626-43f2-a977-f587273b6e98-kube-api-access-cm85l\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219090 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-main-tls\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219098 2572 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-cluster-tls-config\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219107 2572 reconciler_common.go:299] "Volume detached for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219117 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e172067d-0626-43f2-a977-f587273b6e98-alertmanager-main-db\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.219393 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.219126 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e172067d-0626-43f2-a977-f587273b6e98-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:21:52.411532 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.411508 2572 generic.go:358] "Generic (PLEG): container finished" podID="e172067d-0626-43f2-a977-f587273b6e98" containerID="fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0" exitCode=0 Apr 16 18:21:52.411851 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.411575 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerDied","Data":"fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0"} Apr 16 18:21:52.411851 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.411608 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.411851 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.411615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e172067d-0626-43f2-a977-f587273b6e98","Type":"ContainerDied","Data":"350c85e888b4520d9c1c9f7b88f261a8b47c44483101af795f043438811d9b41"} Apr 16 18:21:52.411851 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.411639 2572 scope.go:117] "RemoveContainer" containerID="1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff" Apr 16 18:21:52.413220 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.413195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66489c6f5f-sjrkg" event={"ID":"648f464e-2822-4169-8516-1f3382ca3ba0","Type":"ContainerStarted","Data":"7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa"} Apr 16 18:21:52.413312 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.413230 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66489c6f5f-sjrkg" event={"ID":"648f464e-2822-4169-8516-1f3382ca3ba0","Type":"ContainerStarted","Data":"012cbda0c6b189df0e72b048a25f65f4640faa37c223db495de0c397b041c71b"} Apr 16 18:21:52.415041 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.415019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dj5sz" event={"ID":"528ce5b9-64df-485a-8269-fb5ba8dc8ba5","Type":"ContainerStarted","Data":"7e72f25cef8aa69f0eb501669abe7c4fb5681b487b5f2dcd06c655f5259aaaed"} Apr 16 18:21:52.415131 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.415050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dj5sz" event={"ID":"528ce5b9-64df-485a-8269-fb5ba8dc8ba5","Type":"ContainerStarted","Data":"65ac66b513a16e2af80a9670b7314f6c96a60b5f9b6bc4cfa6050c256eb70fe4"} Apr 16 18:21:52.418828 ip-10-0-138-175 kubenswrapper[2572]: I0416 
18:21:52.418812 2572 scope.go:117] "RemoveContainer" containerID="264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418" Apr 16 18:21:52.424771 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.424756 2572 scope.go:117] "RemoveContainer" containerID="7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636" Apr 16 18:21:52.430310 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.430292 2572 scope.go:117] "RemoveContainer" containerID="fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0" Apr 16 18:21:52.435492 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.435296 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66489c6f5f-sjrkg" podStartSLOduration=2.435284034 podStartE2EDuration="2.435284034s" podCreationTimestamp="2026-04-16 18:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:21:52.434439737 +0000 UTC m=+285.459482535" watchObservedRunningTime="2026-04-16 18:21:52.435284034 +0000 UTC m=+285.460326832" Apr 16 18:21:52.436299 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.436281 2572 scope.go:117] "RemoveContainer" containerID="d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a" Apr 16 18:21:52.442258 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.442243 2572 scope.go:117] "RemoveContainer" containerID="eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70" Apr 16 18:21:52.448008 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.447961 2572 scope.go:117] "RemoveContainer" containerID="cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41" Apr 16 18:21:52.452340 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.452298 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dj5sz" podStartSLOduration=251.6914487 podStartE2EDuration="4m13.452284898s" 
podCreationTimestamp="2026-04-16 18:17:39 +0000 UTC" firstStartedPulling="2026-04-16 18:21:49.531710389 +0000 UTC m=+282.556753165" lastFinishedPulling="2026-04-16 18:21:51.292546575 +0000 UTC m=+284.317589363" observedRunningTime="2026-04-16 18:21:52.45103022 +0000 UTC m=+285.476073018" watchObservedRunningTime="2026-04-16 18:21:52.452284898 +0000 UTC m=+285.477327696" Apr 16 18:21:52.453880 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.453864 2572 scope.go:117] "RemoveContainer" containerID="1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff" Apr 16 18:21:52.454111 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:21:52.454095 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff\": container with ID starting with 1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff not found: ID does not exist" containerID="1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff" Apr 16 18:21:52.454168 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.454116 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff"} err="failed to get container status \"1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff\": rpc error: code = NotFound desc = could not find container \"1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff\": container with ID starting with 1a93f42b45da48687a791da69a265bde5433513d4166795ecaa79f0e4c4222ff not found: ID does not exist" Apr 16 18:21:52.454168 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.454130 2572 scope.go:117] "RemoveContainer" containerID="264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418" Apr 16 18:21:52.454346 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:21:52.454327 2572 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418\": container with ID starting with 264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418 not found: ID does not exist" containerID="264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418" Apr 16 18:21:52.454407 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.454351 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418"} err="failed to get container status \"264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418\": rpc error: code = NotFound desc = could not find container \"264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418\": container with ID starting with 264fd8665a138d354c95aec0f5fe97afaf15f0d3664441f992be8e8541852418 not found: ID does not exist" Apr 16 18:21:52.454407 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.454363 2572 scope.go:117] "RemoveContainer" containerID="7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636" Apr 16 18:21:52.454606 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:21:52.454588 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636\": container with ID starting with 7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636 not found: ID does not exist" containerID="7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636" Apr 16 18:21:52.454645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.454613 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636"} err="failed to get container status 
\"7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636\": rpc error: code = NotFound desc = could not find container \"7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636\": container with ID starting with 7ae2dde5d9996f7447364e5c74dfad807fc5b84bb13c8184b992e0838cad5636 not found: ID does not exist" Apr 16 18:21:52.454645 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.454635 2572 scope.go:117] "RemoveContainer" containerID="fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0" Apr 16 18:21:52.454860 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:21:52.454835 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0\": container with ID starting with fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0 not found: ID does not exist" containerID="fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0" Apr 16 18:21:52.454895 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.454863 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0"} err="failed to get container status \"fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0\": rpc error: code = NotFound desc = could not find container \"fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0\": container with ID starting with fa229cd93714c481378e59e9f818bb3a4d78394b85defda33eb06f521db365e0 not found: ID does not exist" Apr 16 18:21:52.454895 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.454875 2572 scope.go:117] "RemoveContainer" containerID="d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a" Apr 16 18:21:52.455148 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:21:52.455122 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a\": container with ID starting with d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a not found: ID does not exist" containerID="d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a" Apr 16 18:21:52.455218 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.455151 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a"} err="failed to get container status \"d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a\": rpc error: code = NotFound desc = could not find container \"d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a\": container with ID starting with d70b0f2da3e6073f4c6a8e23f17985d46e6d4605f11fa1003d4d3f580fc79a1a not found: ID does not exist" Apr 16 18:21:52.455218 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.455165 2572 scope.go:117] "RemoveContainer" containerID="eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70" Apr 16 18:21:52.455397 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:21:52.455375 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70\": container with ID starting with eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70 not found: ID does not exist" containerID="eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70" Apr 16 18:21:52.455437 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.455404 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70"} err="failed to get container status \"eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70\": rpc error: code = NotFound desc = could not 
find container \"eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70\": container with ID starting with eb978a162f9d41176247680e77ba72ea3b6a11fda33045845e1bbc9f7c25cb70 not found: ID does not exist" Apr 16 18:21:52.455437 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.455422 2572 scope.go:117] "RemoveContainer" containerID="cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41" Apr 16 18:21:52.455635 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:21:52.455616 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41\": container with ID starting with cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41 not found: ID does not exist" containerID="cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41" Apr 16 18:21:52.455715 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.455639 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41"} err="failed to get container status \"cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41\": rpc error: code = NotFound desc = could not find container \"cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41\": container with ID starting with cc0b667fa20e8fc161c93272433663bba5cc7748336b231f4c41c1f5f3c9ae41 not found: ID does not exist" Apr 16 18:21:52.465514 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.465493 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:21:52.470750 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.470702 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:21:52.507303 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507281 2572 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:21:52.507538 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507522 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="config-reloader" Apr 16 18:21:52.507614 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507541 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="config-reloader" Apr 16 18:21:52.507614 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507552 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy" Apr 16 18:21:52.507614 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507561 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy" Apr 16 18:21:52.507614 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507571 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy-metric" Apr 16 18:21:52.507614 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507579 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy-metric" Apr 16 18:21:52.507614 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507592 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy-web" Apr 16 18:21:52.507614 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507600 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy-web" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507620 2572 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="init-config-reloader" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507628 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="init-config-reloader" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507637 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="prom-label-proxy" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507645 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="prom-label-proxy" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507659 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="alertmanager" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507666 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="alertmanager" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507721 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="alertmanager" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507733 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="prom-label-proxy" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507743 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="config-reloader" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507753 2572 
memory_manager.go:356] "RemoveStaleState removing state" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy-web" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507763 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy-metric" Apr 16 18:21:52.507900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.507772 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e172067d-0626-43f2-a977-f587273b6e98" containerName="kube-rbac-proxy" Apr 16 18:21:52.513833 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.513815 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.516164 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.516139 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:21:52.516245 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.516153 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:21:52.516245 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.516236 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-659xp\"" Apr 16 18:21:52.516345 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.516185 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:21:52.516555 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.516534 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:21:52.516555 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.516547 2572 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:21:52.516704 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.516551 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:21:52.516704 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.516697 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:21:52.516797 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.516782 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:21:52.521733 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.521715 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:21:52.524143 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.524124 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:21:52.621377 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621355 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621451 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621451 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-web-config\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621604 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621583 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621671 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621649 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621714 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621701 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-config-volume\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621750 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-config-out\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621804 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621786 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621834 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621865 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx58h\" (UniqueName: \"kubernetes.io/projected/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-kube-api-access-zx58h\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621898 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621886 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621930 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621915 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.621962 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.621935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.722884 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.722836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.722884 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.722867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723021 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.722886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-web-config\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723021 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.722904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723021 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.722921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723171 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.723053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-config-volume\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723171 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.723103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-config-out\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723171 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.723136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723171 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.723167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723350 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.723193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx58h\" (UniqueName: \"kubernetes.io/projected/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-kube-api-access-zx58h\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723350 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.723232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723350 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.723263 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723350 ip-10-0-138-175 
kubenswrapper[2572]: I0416 18:21:52.723287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.723903 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.723879 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.724068 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.723941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.725917 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.725874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.725917 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.725905 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-config-volume\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 16 18:21:52.726072 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.725905 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-web-config\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.726294 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.726239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.726294 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.726244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.726417 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.726333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.726523 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.726504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-config-out\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.726642 ip-10-0-138-175 
kubenswrapper[2572]: I0416 18:21:52.726621 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.726844 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.726829 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.727852 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.727836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.732736 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.732717 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx58h\" (UniqueName: \"kubernetes.io/projected/9daafdb0-a23f-458e-b3bf-c2f20a0fbd12-kube-api-access-zx58h\") pod \"alertmanager-main-0\" (UID: \"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.824314 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.824288 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:21:52.957703 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:52.957651 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:21:52.960551 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:21:52.960526 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9daafdb0_a23f_458e_b3bf_c2f20a0fbd12.slice/crio-e71c2d4c1aca5479d80762caa0dd8f988d8333598af32681f19e2fe67014cfa2 WatchSource:0}: Error finding container e71c2d4c1aca5479d80762caa0dd8f988d8333598af32681f19e2fe67014cfa2: Status 404 returned error can't find the container with id e71c2d4c1aca5479d80762caa0dd8f988d8333598af32681f19e2fe67014cfa2 Apr 16 18:21:53.419418 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:53.419355 2572 generic.go:358] "Generic (PLEG): container finished" podID="9daafdb0-a23f-458e-b3bf-c2f20a0fbd12" containerID="8a1a87a0f1a30913dd77ea6abaf9467c3ee05fceb3874d9b61b3a0f79b32291e" exitCode=0 Apr 16 18:21:53.419768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:53.419446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12","Type":"ContainerDied","Data":"8a1a87a0f1a30913dd77ea6abaf9467c3ee05fceb3874d9b61b3a0f79b32291e"} Apr 16 18:21:53.419768 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:53.419493 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12","Type":"ContainerStarted","Data":"e71c2d4c1aca5479d80762caa0dd8f988d8333598af32681f19e2fe67014cfa2"} Apr 16 18:21:53.420915 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:53.420892 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dj5sz" Apr 16 18:21:53.581722 ip-10-0-138-175 
kubenswrapper[2572]: I0416 18:21:53.581696 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e172067d-0626-43f2-a977-f587273b6e98" path="/var/lib/kubelet/pods/e172067d-0626-43f2-a977-f587273b6e98/volumes" Apr 16 18:21:54.427061 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:54.427023 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12","Type":"ContainerStarted","Data":"11daec17de13c4b82ee4d3948d932cc49804216cea51dc9dc98944d20818e2e4"} Apr 16 18:21:54.427061 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:54.427065 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12","Type":"ContainerStarted","Data":"9bb590b690c6370b68c5af49972402b7dc68af1d02fa074ee0aa6bbb419eeeba"} Apr 16 18:21:54.427497 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:54.427075 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12","Type":"ContainerStarted","Data":"ee0a12f1c5c65d966e3563df56f8d351eb92b993ab7ad44a3c5eb25b51154b8c"} Apr 16 18:21:54.427497 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:54.427084 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12","Type":"ContainerStarted","Data":"918c5664b3c3c6e74f3b45491ef156a0247a55f00aa666c065301a1f5755e900"} Apr 16 18:21:54.427497 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:54.427092 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12","Type":"ContainerStarted","Data":"58fe0d89340fe87e718416de675305a5878c419d7b50f6a6fc7dc06368fbde1e"} Apr 16 18:21:54.427497 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:54.427101 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9daafdb0-a23f-458e-b3bf-c2f20a0fbd12","Type":"ContainerStarted","Data":"14defe121fb166879002b93c88d9cfbccb0262bebacf99707a6d9344f2cf34a7"} Apr 16 18:21:54.456628 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:21:54.456576 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.456561054 podStartE2EDuration="2.456561054s" podCreationTimestamp="2026-04-16 18:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:21:54.454874108 +0000 UTC m=+287.479916939" watchObservedRunningTime="2026-04-16 18:21:54.456561054 +0000 UTC m=+287.481603848" Apr 16 18:22:01.168031 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:22:01.167983 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:22:01.168031 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:22:01.168040 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:22:01.172796 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:22:01.172773 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:22:01.452098 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:22:01.452023 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:22:03.429227 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:22:03.429200 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dj5sz" Apr 16 18:22:07.438418 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:22:07.438297 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:22:07.438814 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:22:07.438798 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:22:07.441208 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:22:07.441112 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:25:55.913213 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:55.913180 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm"] Apr 16 18:25:55.916500 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:55.916483 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:55.918938 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:55.918919 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dftd2\"" Apr 16 18:25:55.919303 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:55.919286 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:25:55.919572 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:55.919555 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:25:55.926251 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:55.926227 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm"] Apr 16 18:25:56.042441 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.042422 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.042536 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.042452 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.042536 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.042474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gddn\" (UniqueName: \"kubernetes.io/projected/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-kube-api-access-6gddn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.143293 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.143268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.143379 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.143303 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.143379 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.143323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gddn\" (UniqueName: \"kubernetes.io/projected/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-kube-api-access-6gddn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.143596 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.143579 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.143660 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.143644 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.154160 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.154135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gddn\" (UniqueName: 
\"kubernetes.io/projected/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-kube-api-access-6gddn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.225699 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.225682 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:25:56.338578 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.338554 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm"] Apr 16 18:25:56.340414 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:25:56.340389 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod892bb7d3_a9b6_4343_ba6f_9f1608a7b5da.slice/crio-e5471ac910a7dc55efc4ad05ebb2c9e7d2bd327e17b743ef77e9c46a5428aabe WatchSource:0}: Error finding container e5471ac910a7dc55efc4ad05ebb2c9e7d2bd327e17b743ef77e9c46a5428aabe: Status 404 returned error can't find the container with id e5471ac910a7dc55efc4ad05ebb2c9e7d2bd327e17b743ef77e9c46a5428aabe Apr 16 18:25:56.342196 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:56.342183 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:25:57.046495 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:25:57.046449 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" event={"ID":"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da","Type":"ContainerStarted","Data":"e5471ac910a7dc55efc4ad05ebb2c9e7d2bd327e17b743ef77e9c46a5428aabe"} Apr 16 18:26:01.059057 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:01.059032 2572 
generic.go:358] "Generic (PLEG): container finished" podID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerID="cab13ed9c293e913427d858cb89b9df69c3cc8d69cc644dbce2f89e92c14a17c" exitCode=0 Apr 16 18:26:01.059349 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:01.059069 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" event={"ID":"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da","Type":"ContainerDied","Data":"cab13ed9c293e913427d858cb89b9df69c3cc8d69cc644dbce2f89e92c14a17c"} Apr 16 18:26:04.069001 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:04.068952 2572 generic.go:358] "Generic (PLEG): container finished" podID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerID="ce96e55c22e5691649cdaed3dbd6b225ac33d068aeaf968515f26b6d46d76fec" exitCode=0 Apr 16 18:26:04.069375 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:04.069038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" event={"ID":"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da","Type":"ContainerDied","Data":"ce96e55c22e5691649cdaed3dbd6b225ac33d068aeaf968515f26b6d46d76fec"} Apr 16 18:26:07.234890 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:07.234862 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66489c6f5f-sjrkg"] Apr 16 18:26:10.088862 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:10.088821 2572 generic.go:358] "Generic (PLEG): container finished" podID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerID="25860990c49c53fd3b92ecedb3a1f486fccf10233dec4bcf56b675ed00882560" exitCode=0 Apr 16 18:26:10.088862 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:10.088861 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" 
event={"ID":"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da","Type":"ContainerDied","Data":"25860990c49c53fd3b92ecedb3a1f486fccf10233dec4bcf56b675ed00882560"} Apr 16 18:26:11.202682 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.202657 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:26:11.377711 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.377627 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-util\") pod \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " Apr 16 18:26:11.377878 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.377722 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-bundle\") pod \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " Apr 16 18:26:11.377878 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.377767 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gddn\" (UniqueName: \"kubernetes.io/projected/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-kube-api-access-6gddn\") pod \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\" (UID: \"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da\") " Apr 16 18:26:11.378274 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.378248 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-bundle" (OuterVolumeSpecName: "bundle") pod "892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" (UID: "892bb7d3-a9b6-4343-ba6f-9f1608a7b5da"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:11.379867 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.379845 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-kube-api-access-6gddn" (OuterVolumeSpecName: "kube-api-access-6gddn") pod "892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" (UID: "892bb7d3-a9b6-4343-ba6f-9f1608a7b5da"). InnerVolumeSpecName "kube-api-access-6gddn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:11.381485 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.381464 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-util" (OuterVolumeSpecName: "util") pod "892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" (UID: "892bb7d3-a9b6-4343-ba6f-9f1608a7b5da"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:11.479192 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.479165 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-util\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:11.479192 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.479186 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-bundle\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:11.479192 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:11.479195 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6gddn\" (UniqueName: \"kubernetes.io/projected/892bb7d3-a9b6-4343-ba6f-9f1608a7b5da-kube-api-access-6gddn\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:12.095342 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:12.095308 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" event={"ID":"892bb7d3-a9b6-4343-ba6f-9f1608a7b5da","Type":"ContainerDied","Data":"e5471ac910a7dc55efc4ad05ebb2c9e7d2bd327e17b743ef77e9c46a5428aabe"} Apr 16 18:26:12.095342 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:12.095345 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5471ac910a7dc55efc4ad05ebb2c9e7d2bd327e17b743ef77e9c46a5428aabe" Apr 16 18:26:12.095536 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:12.095359 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn6nqm" Apr 16 18:26:16.839696 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.839657 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8"] Apr 16 18:26:16.840102 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.839979 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerName="util" Apr 16 18:26:16.840102 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.840005 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerName="util" Apr 16 18:26:16.840102 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.840034 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerName="pull" Apr 16 18:26:16.840102 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.840039 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerName="pull" Apr 16 18:26:16.840102 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.840046 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerName="extract" Apr 16 18:26:16.840102 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.840052 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerName="extract" Apr 16 18:26:16.840291 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.840109 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="892bb7d3-a9b6-4343-ba6f-9f1608a7b5da" containerName="extract" Apr 16 18:26:16.847455 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.847437 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" Apr 16 18:26:16.849704 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.849684 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:26:16.849794 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.849723 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:26:16.850858 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.850830 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:26:16.850858 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.850835 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 18:26:16.850981 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.850869 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-fr8cv\"" Apr 16 18:26:16.854575 ip-10-0-138-175 kubenswrapper[2572]: 
I0416 18:26:16.854557 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt"] Apr 16 18:26:16.861253 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.861235 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8"] Apr 16 18:26:16.861334 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.861317 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:16.863276 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.863257 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 18:26:16.866385 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:16.866366 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt"] Apr 16 18:26:17.018566 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.018537 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a75f1024-a77d-4efb-a77d-e1cf129f4f0d-tmp\") pod \"klusterlet-addon-workmgr-64c9f7c49f-4j4bt\" (UID: \"a75f1024-a77d-4efb-a77d-e1cf129f4f0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.018566 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.018569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-462pq\" (UniqueName: \"kubernetes.io/projected/65ab0e40-5e3f-492a-b381-ba30127be76e-kube-api-access-462pq\") pod \"managed-serviceaccount-addon-agent-76b7f47c59-smhs8\" (UID: \"65ab0e40-5e3f-492a-b381-ba30127be76e\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" Apr 16 18:26:17.018736 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.018617 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a75f1024-a77d-4efb-a77d-e1cf129f4f0d-klusterlet-config\") pod \"klusterlet-addon-workmgr-64c9f7c49f-4j4bt\" (UID: \"a75f1024-a77d-4efb-a77d-e1cf129f4f0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.018736 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.018639 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/65ab0e40-5e3f-492a-b381-ba30127be76e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-76b7f47c59-smhs8\" (UID: \"65ab0e40-5e3f-492a-b381-ba30127be76e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" Apr 16 18:26:17.018736 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.018663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f96fr\" (UniqueName: \"kubernetes.io/projected/a75f1024-a77d-4efb-a77d-e1cf129f4f0d-kube-api-access-f96fr\") pod \"klusterlet-addon-workmgr-64c9f7c49f-4j4bt\" (UID: \"a75f1024-a77d-4efb-a77d-e1cf129f4f0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.119521 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.119450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a75f1024-a77d-4efb-a77d-e1cf129f4f0d-klusterlet-config\") pod \"klusterlet-addon-workmgr-64c9f7c49f-4j4bt\" (UID: \"a75f1024-a77d-4efb-a77d-e1cf129f4f0d\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.119521 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.119483 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/65ab0e40-5e3f-492a-b381-ba30127be76e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-76b7f47c59-smhs8\" (UID: \"65ab0e40-5e3f-492a-b381-ba30127be76e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" Apr 16 18:26:17.119684 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.119606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f96fr\" (UniqueName: \"kubernetes.io/projected/a75f1024-a77d-4efb-a77d-e1cf129f4f0d-kube-api-access-f96fr\") pod \"klusterlet-addon-workmgr-64c9f7c49f-4j4bt\" (UID: \"a75f1024-a77d-4efb-a77d-e1cf129f4f0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.119722 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.119687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a75f1024-a77d-4efb-a77d-e1cf129f4f0d-tmp\") pod \"klusterlet-addon-workmgr-64c9f7c49f-4j4bt\" (UID: \"a75f1024-a77d-4efb-a77d-e1cf129f4f0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.119722 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.119712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-462pq\" (UniqueName: \"kubernetes.io/projected/65ab0e40-5e3f-492a-b381-ba30127be76e-kube-api-access-462pq\") pod \"managed-serviceaccount-addon-agent-76b7f47c59-smhs8\" (UID: \"65ab0e40-5e3f-492a-b381-ba30127be76e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" Apr 16 18:26:17.120081 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.120057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a75f1024-a77d-4efb-a77d-e1cf129f4f0d-tmp\") pod \"klusterlet-addon-workmgr-64c9f7c49f-4j4bt\" (UID: \"a75f1024-a77d-4efb-a77d-e1cf129f4f0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.121854 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.121834 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/65ab0e40-5e3f-492a-b381-ba30127be76e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-76b7f47c59-smhs8\" (UID: \"65ab0e40-5e3f-492a-b381-ba30127be76e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" Apr 16 18:26:17.122110 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.122091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a75f1024-a77d-4efb-a77d-e1cf129f4f0d-klusterlet-config\") pod \"klusterlet-addon-workmgr-64c9f7c49f-4j4bt\" (UID: \"a75f1024-a77d-4efb-a77d-e1cf129f4f0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.135902 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.135874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-462pq\" (UniqueName: \"kubernetes.io/projected/65ab0e40-5e3f-492a-b381-ba30127be76e-kube-api-access-462pq\") pod \"managed-serviceaccount-addon-agent-76b7f47c59-smhs8\" (UID: \"65ab0e40-5e3f-492a-b381-ba30127be76e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" Apr 16 18:26:17.136008 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.135952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f96fr\" 
(UniqueName: \"kubernetes.io/projected/a75f1024-a77d-4efb-a77d-e1cf129f4f0d-kube-api-access-f96fr\") pod \"klusterlet-addon-workmgr-64c9f7c49f-4j4bt\" (UID: \"a75f1024-a77d-4efb-a77d-e1cf129f4f0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.168468 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.168445 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" Apr 16 18:26:17.175117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.175099 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:17.330683 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.330628 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8"] Apr 16 18:26:17.332785 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:17.332764 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt"] Apr 16 18:26:17.345747 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:26:17.345722 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75f1024_a77d_4efb_a77d_e1cf129f4f0d.slice/crio-884ef7370efbea6b7cd2538a0c8352e73e92c976bbdd33a1ed30531cf4157ca4 WatchSource:0}: Error finding container 884ef7370efbea6b7cd2538a0c8352e73e92c976bbdd33a1ed30531cf4157ca4: Status 404 returned error can't find the container with id 884ef7370efbea6b7cd2538a0c8352e73e92c976bbdd33a1ed30531cf4157ca4 Apr 16 18:26:17.346156 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:26:17.346127 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ab0e40_5e3f_492a_b381_ba30127be76e.slice/crio-4b8fe3fda95fd2146d7db9d125f714e9ceefb4386c695d60f276ca5d1656f552 WatchSource:0}: Error finding container 4b8fe3fda95fd2146d7db9d125f714e9ceefb4386c695d60f276ca5d1656f552: Status 404 returned error can't find the container with id 4b8fe3fda95fd2146d7db9d125f714e9ceefb4386c695d60f276ca5d1656f552 Apr 16 18:26:18.110585 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.110551 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" event={"ID":"a75f1024-a77d-4efb-a77d-e1cf129f4f0d","Type":"ContainerStarted","Data":"884ef7370efbea6b7cd2538a0c8352e73e92c976bbdd33a1ed30531cf4157ca4"} Apr 16 18:26:18.111475 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.111454 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" event={"ID":"65ab0e40-5e3f-492a-b381-ba30127be76e","Type":"ContainerStarted","Data":"4b8fe3fda95fd2146d7db9d125f714e9ceefb4386c695d60f276ca5d1656f552"} Apr 16 18:26:18.397240 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.397165 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl"] Apr 16 18:26:18.423167 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.423141 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl"] Apr 16 18:26:18.423305 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.423270 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:18.425605 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.425585 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:26:18.425731 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.425641 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:26:18.425957 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.425936 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-t75bx\"" Apr 16 18:26:18.426083 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.426024 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:26:18.531773 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.531629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a3457da0-8454-4c29-b40e-3a730a80616f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl\" (UID: \"a3457da0-8454-4c29-b40e-3a730a80616f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:18.531773 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.531686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9lg\" (UniqueName: \"kubernetes.io/projected/a3457da0-8454-4c29-b40e-3a730a80616f-kube-api-access-xt9lg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl\" (UID: \"a3457da0-8454-4c29-b40e-3a730a80616f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:18.632113 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.632081 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a3457da0-8454-4c29-b40e-3a730a80616f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl\" (UID: \"a3457da0-8454-4c29-b40e-3a730a80616f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:18.632285 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.632222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt9lg\" (UniqueName: \"kubernetes.io/projected/a3457da0-8454-4c29-b40e-3a730a80616f-kube-api-access-xt9lg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl\" (UID: \"a3457da0-8454-4c29-b40e-3a730a80616f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:18.634907 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.634884 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a3457da0-8454-4c29-b40e-3a730a80616f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl\" (UID: \"a3457da0-8454-4c29-b40e-3a730a80616f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:18.644330 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.644302 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt9lg\" (UniqueName: \"kubernetes.io/projected/a3457da0-8454-4c29-b40e-3a730a80616f-kube-api-access-xt9lg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl\" (UID: \"a3457da0-8454-4c29-b40e-3a730a80616f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:18.736463 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.736399 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:18.886120 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:18.886072 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl"] Apr 16 18:26:18.890129 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:26:18.890098 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3457da0_8454_4c29_b40e_3a730a80616f.slice/crio-82f77e9160b51aebac9a5a79d804e8145907f75cf1f29afefc5c41777b798448 WatchSource:0}: Error finding container 82f77e9160b51aebac9a5a79d804e8145907f75cf1f29afefc5c41777b798448: Status 404 returned error can't find the container with id 82f77e9160b51aebac9a5a79d804e8145907f75cf1f29afefc5c41777b798448 Apr 16 18:26:19.116472 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:19.116393 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" event={"ID":"a3457da0-8454-4c29-b40e-3a730a80616f","Type":"ContainerStarted","Data":"82f77e9160b51aebac9a5a79d804e8145907f75cf1f29afefc5c41777b798448"} Apr 16 18:26:21.124938 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:21.124400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" event={"ID":"65ab0e40-5e3f-492a-b381-ba30127be76e","Type":"ContainerStarted","Data":"d6bdca0c259c77d3a2f2a48f7ee3087a0ac4876091f4ceb3660795f217e58098"} Apr 16 18:26:21.144750 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:21.144691 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76b7f47c59-smhs8" podStartSLOduration=1.5363327450000002 podStartE2EDuration="5.144671369s" podCreationTimestamp="2026-04-16 18:26:16 +0000 UTC" 
firstStartedPulling="2026-04-16 18:26:17.347826308 +0000 UTC m=+550.372869085" lastFinishedPulling="2026-04-16 18:26:20.95616492 +0000 UTC m=+553.981207709" observedRunningTime="2026-04-16 18:26:21.142395597 +0000 UTC m=+554.167438391" watchObservedRunningTime="2026-04-16 18:26:21.144671369 +0000 UTC m=+554.169714168" Apr 16 18:26:24.868817 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.868783 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lxkfn"] Apr 16 18:26:24.872075 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.872055 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:24.874030 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.874007 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:26:24.874406 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.874242 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-bntgt\"" Apr 16 18:26:24.874406 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.874256 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:26:24.878790 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.878764 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-certificates\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:24.878889 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.878810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdp2v\" (UniqueName: 
\"kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-kube-api-access-mdp2v\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:24.878889 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.878851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-cabundle0\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:24.881375 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.881352 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lxkfn"] Apr 16 18:26:24.979836 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.979803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-certificates\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:24.980026 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.979850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdp2v\" (UniqueName: \"kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-kube-api-access-mdp2v\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:24.980026 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.979891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-cabundle0\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: 
\"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:24.980149 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:24.980023 2572 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:26:24.980149 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:24.980045 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:26:24.980149 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:24.980057 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lxkfn: references non-existent secret key: ca.crt Apr 16 18:26:24.980149 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:24.980126 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-certificates podName:ba7dd762-b0b6-4b5d-8656-805464e8a2f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:26:25.480106274 +0000 UTC m=+558.505149066 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-certificates") pod "keda-operator-ffbb595cb-lxkfn" (UID: "ba7dd762-b0b6-4b5d-8656-805464e8a2f4") : references non-existent secret key: ca.crt Apr 16 18:26:24.980634 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.980609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-cabundle0\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:24.991607 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:24.991585 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdp2v\" (UniqueName: \"kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-kube-api-access-mdp2v\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:25.142422 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.142329 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" event={"ID":"a75f1024-a77d-4efb-a77d-e1cf129f4f0d","Type":"ContainerStarted","Data":"0e8e05e7ca695c73f72101f0915a5cc1ce064c8edb05b3e6099a9a05e7343ae7"} Apr 16 18:26:25.142574 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.142508 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:25.143973 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.143937 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" 
event={"ID":"a3457da0-8454-4c29-b40e-3a730a80616f","Type":"ContainerStarted","Data":"38658e242a276b309b83c891458ca492972c87b46df8dc703156b9b3bb3dbf63"} Apr 16 18:26:25.144135 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.144117 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:25.144556 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.144540 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" Apr 16 18:26:25.166930 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.166888 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64c9f7c49f-4j4bt" podStartSLOduration=2.198389783 podStartE2EDuration="9.16687672s" podCreationTimestamp="2026-04-16 18:26:16 +0000 UTC" firstStartedPulling="2026-04-16 18:26:17.347402178 +0000 UTC m=+550.372444955" lastFinishedPulling="2026-04-16 18:26:24.315889116 +0000 UTC m=+557.340931892" observedRunningTime="2026-04-16 18:26:25.165857999 +0000 UTC m=+558.190900796" watchObservedRunningTime="2026-04-16 18:26:25.16687672 +0000 UTC m=+558.191919514" Apr 16 18:26:25.208603 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.208567 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" podStartSLOduration=1.786820893 podStartE2EDuration="7.208554448s" podCreationTimestamp="2026-04-16 18:26:18 +0000 UTC" firstStartedPulling="2026-04-16 18:26:18.892591436 +0000 UTC m=+551.917634226" lastFinishedPulling="2026-04-16 18:26:24.314324994 +0000 UTC m=+557.339367781" observedRunningTime="2026-04-16 18:26:25.208120242 +0000 UTC m=+558.233163040" watchObservedRunningTime="2026-04-16 18:26:25.208554448 +0000 UTC m=+558.233597245" Apr 16 18:26:25.215839 
ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.215817 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k"] Apr 16 18:26:25.218907 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.218893 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.220923 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.220909 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 18:26:25.230008 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.229971 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k"] Apr 16 18:26:25.282386 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.282361 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.282492 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.282401 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.282539 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.282499 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq75k\" (UniqueName: 
\"kubernetes.io/projected/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-kube-api-access-tq75k\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.383738 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.383710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq75k\" (UniqueName: \"kubernetes.io/projected/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-kube-api-access-tq75k\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.383900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.383771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.383900 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.383850 2572 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:26:25.383900 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.383860 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:26:25.383900 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.383879 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k: references non-existent secret key: tls.crt Apr 16 18:26:25.383900 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.383883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: 
\"kubernetes.io/empty-dir/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.384180 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.383929 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-certificates podName:e66fd353-c766-47f5-a1fc-e1cd6e0d49d7 nodeName:}" failed. No retries permitted until 2026-04-16 18:26:25.8839155 +0000 UTC m=+558.908958276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-certificates") pod "keda-metrics-apiserver-7c9f485588-d8t2k" (UID: "e66fd353-c766-47f5-a1fc-e1cd6e0d49d7") : references non-existent secret key: tls.crt Apr 16 18:26:25.384226 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.384217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.397317 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.397256 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq75k\" (UniqueName: \"kubernetes.io/projected/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-kube-api-access-tq75k\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.401438 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.401419 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-wz244"] Apr 16 18:26:25.404701 ip-10-0-138-175 
kubenswrapper[2572]: I0416 18:26:25.404684 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:25.406888 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.406872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:26:25.414115 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.414091 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-wz244"] Apr 16 18:26:25.484578 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.484548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5bg\" (UniqueName: \"kubernetes.io/projected/d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee-kube-api-access-4t5bg\") pod \"keda-admission-cf49989db-wz244\" (UID: \"d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee\") " pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:25.484712 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.484594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee-certificates\") pod \"keda-admission-cf49989db-wz244\" (UID: \"d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee\") " pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:25.484712 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.484630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-certificates\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:25.484790 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.484763 2572 secret.go:281] references non-existent 
secret key: ca.crt Apr 16 18:26:25.484790 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.484779 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:26:25.484790 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.484787 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lxkfn: references non-existent secret key: ca.crt Apr 16 18:26:25.484875 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.484832 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-certificates podName:ba7dd762-b0b6-4b5d-8656-805464e8a2f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:26:26.484815854 +0000 UTC m=+559.509858639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-certificates") pod "keda-operator-ffbb595cb-lxkfn" (UID: "ba7dd762-b0b6-4b5d-8656-805464e8a2f4") : references non-existent secret key: ca.crt Apr 16 18:26:25.585204 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.585180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4t5bg\" (UniqueName: \"kubernetes.io/projected/d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee-kube-api-access-4t5bg\") pod \"keda-admission-cf49989db-wz244\" (UID: \"d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee\") " pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:25.585301 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.585214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee-certificates\") pod \"keda-admission-cf49989db-wz244\" (UID: \"d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee\") " pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 
16 18:26:25.585408 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.585392 2572 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 18:26:25.585453 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.585421 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-wz244: secret "keda-admission-webhooks-certs" not found Apr 16 18:26:25.585496 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:25.585486 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee-certificates podName:d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee nodeName:}" failed. No retries permitted until 2026-04-16 18:26:26.085467075 +0000 UTC m=+559.110509854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee-certificates") pod "keda-admission-cf49989db-wz244" (UID: "d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee") : secret "keda-admission-webhooks-certs" not found Apr 16 18:26:25.595354 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.595331 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5bg\" (UniqueName: \"kubernetes.io/projected/d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee-kube-api-access-4t5bg\") pod \"keda-admission-cf49989db-wz244\" (UID: \"d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee\") " pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:25.887195 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.887158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:25.889592 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:25.889573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e66fd353-c766-47f5-a1fc-e1cd6e0d49d7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d8t2k\" (UID: \"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:26.088885 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.088855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee-certificates\") pod \"keda-admission-cf49989db-wz244\" (UID: \"d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee\") " pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:26.091151 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.091132 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee-certificates\") pod \"keda-admission-cf49989db-wz244\" (UID: \"d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee\") " pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:26.128998 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.128973 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:26.257981 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.257950 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k"] Apr 16 18:26:26.261686 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:26:26.261657 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode66fd353_c766_47f5_a1fc_e1cd6e0d49d7.slice/crio-c7df1ae9023cd74102e4335ca8472e9636b687bada6c7f130a5aa28218244d6e WatchSource:0}: Error finding container c7df1ae9023cd74102e4335ca8472e9636b687bada6c7f130a5aa28218244d6e: Status 404 returned error can't find the container with id c7df1ae9023cd74102e4335ca8472e9636b687bada6c7f130a5aa28218244d6e Apr 16 18:26:26.314692 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.314666 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:26.435758 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.435695 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-wz244"] Apr 16 18:26:26.438652 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:26:26.438628 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2ec03ff_ba24_47d1_a3af_d4e25b20a7ee.slice/crio-ba6466e996db05be9b7dbfca9579333015c1c06e5779ee2ccf35c39a0e035493 WatchSource:0}: Error finding container ba6466e996db05be9b7dbfca9579333015c1c06e5779ee2ccf35c39a0e035493: Status 404 returned error can't find the container with id ba6466e996db05be9b7dbfca9579333015c1c06e5779ee2ccf35c39a0e035493 Apr 16 18:26:26.492647 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.492622 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-certificates\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:26.494770 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.494753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba7dd762-b0b6-4b5d-8656-805464e8a2f4-certificates\") pod \"keda-operator-ffbb595cb-lxkfn\" (UID: \"ba7dd762-b0b6-4b5d-8656-805464e8a2f4\") " pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:26.683145 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.683121 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:26.800271 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:26.799659 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lxkfn"] Apr 16 18:26:26.802455 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:26:26.802426 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba7dd762_b0b6_4b5d_8656_805464e8a2f4.slice/crio-ad6f5630d28e2ade12ef9e899096d12eddec349a23ba62249584e0103699ea49 WatchSource:0}: Error finding container ad6f5630d28e2ade12ef9e899096d12eddec349a23ba62249584e0103699ea49: Status 404 returned error can't find the container with id ad6f5630d28e2ade12ef9e899096d12eddec349a23ba62249584e0103699ea49 Apr 16 18:26:27.152914 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:27.152877 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-wz244" event={"ID":"d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee","Type":"ContainerStarted","Data":"ba6466e996db05be9b7dbfca9579333015c1c06e5779ee2ccf35c39a0e035493"} Apr 16 18:26:27.154389 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:27.154361 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" event={"ID":"ba7dd762-b0b6-4b5d-8656-805464e8a2f4","Type":"ContainerStarted","Data":"ad6f5630d28e2ade12ef9e899096d12eddec349a23ba62249584e0103699ea49"} Apr 16 18:26:27.155916 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:27.155885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" event={"ID":"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7","Type":"ContainerStarted","Data":"c7df1ae9023cd74102e4335ca8472e9636b687bada6c7f130a5aa28218244d6e"} Apr 16 18:26:28.160605 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:28.160518 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-wz244" event={"ID":"d2ec03ff-ba24-47d1-a3af-d4e25b20a7ee","Type":"ContainerStarted","Data":"68466d25dc9b67fff05d8500c0b632157969df894731548a5de06bb8945af4ac"} Apr 16 18:26:28.160605 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:28.160596 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:28.178471 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:28.178375 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-wz244" podStartSLOduration=1.711510766 podStartE2EDuration="3.178357549s" podCreationTimestamp="2026-04-16 18:26:25 +0000 UTC" firstStartedPulling="2026-04-16 18:26:26.439910533 +0000 UTC m=+559.464953309" lastFinishedPulling="2026-04-16 18:26:27.906757306 +0000 UTC m=+560.931800092" observedRunningTime="2026-04-16 18:26:28.177417783 +0000 UTC m=+561.202460576" watchObservedRunningTime="2026-04-16 18:26:28.178357549 +0000 UTC m=+561.203400347" Apr 16 18:26:29.168343 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:29.168259 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" event={"ID":"e66fd353-c766-47f5-a1fc-e1cd6e0d49d7","Type":"ContainerStarted","Data":"dbace20a16265fc5c8288c5bc5561e59df51e4f05576dd90bdbd564fae3e7f0d"} Apr 16 18:26:29.168693 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:29.168427 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:29.186728 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:29.186684 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" podStartSLOduration=1.561741861 podStartE2EDuration="4.186671837s" podCreationTimestamp="2026-04-16 18:26:25 +0000 UTC" firstStartedPulling="2026-04-16 18:26:26.262927489 +0000 UTC m=+559.287970265" lastFinishedPulling="2026-04-16 18:26:28.887857452 +0000 UTC m=+561.912900241" observedRunningTime="2026-04-16 18:26:29.186313639 +0000 UTC m=+562.211356438" watchObservedRunningTime="2026-04-16 18:26:29.186671837 +0000 UTC m=+562.211714634" Apr 16 18:26:31.179452 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:31.179408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" event={"ID":"ba7dd762-b0b6-4b5d-8656-805464e8a2f4","Type":"ContainerStarted","Data":"6c6002e04ea19e73398eb3b3216d8b12bde7609a2270a11566d9449aff2b6f8d"} Apr 16 18:26:31.179941 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:31.179522 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:26:31.205278 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:31.205225 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" podStartSLOduration=3.491553431 podStartE2EDuration="7.205210919s" podCreationTimestamp="2026-04-16 18:26:24 +0000 UTC" firstStartedPulling="2026-04-16 
18:26:26.803749704 +0000 UTC m=+559.828792480" lastFinishedPulling="2026-04-16 18:26:30.517407191 +0000 UTC m=+563.542449968" observedRunningTime="2026-04-16 18:26:31.204172991 +0000 UTC m=+564.229215811" watchObservedRunningTime="2026-04-16 18:26:31.205210919 +0000 UTC m=+564.230253781" Apr 16 18:26:32.255918 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.255863 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66489c6f5f-sjrkg" podUID="648f464e-2822-4169-8516-1f3382ca3ba0" containerName="console" containerID="cri-o://7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa" gracePeriod=15 Apr 16 18:26:32.481902 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.481883 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66489c6f5f-sjrkg_648f464e-2822-4169-8516-1f3382ca3ba0/console/0.log" Apr 16 18:26:32.482024 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.481938 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:26:32.542859 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.542784 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-service-ca\") pod \"648f464e-2822-4169-8516-1f3382ca3ba0\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " Apr 16 18:26:32.542859 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.542826 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-trusted-ca-bundle\") pod \"648f464e-2822-4169-8516-1f3382ca3ba0\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " Apr 16 18:26:32.543095 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.542874 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-oauth-config\") pod \"648f464e-2822-4169-8516-1f3382ca3ba0\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " Apr 16 18:26:32.543095 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.542904 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-oauth-serving-cert\") pod \"648f464e-2822-4169-8516-1f3382ca3ba0\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " Apr 16 18:26:32.543095 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.542941 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-console-config\") pod \"648f464e-2822-4169-8516-1f3382ca3ba0\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " Apr 16 18:26:32.543095 ip-10-0-138-175 
kubenswrapper[2572]: I0416 18:26:32.542970 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnj29\" (UniqueName: \"kubernetes.io/projected/648f464e-2822-4169-8516-1f3382ca3ba0-kube-api-access-hnj29\") pod \"648f464e-2822-4169-8516-1f3382ca3ba0\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " Apr 16 18:26:32.543095 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.543030 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-serving-cert\") pod \"648f464e-2822-4169-8516-1f3382ca3ba0\" (UID: \"648f464e-2822-4169-8516-1f3382ca3ba0\") " Apr 16 18:26:32.543339 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.543314 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-service-ca" (OuterVolumeSpecName: "service-ca") pod "648f464e-2822-4169-8516-1f3382ca3ba0" (UID: "648f464e-2822-4169-8516-1f3382ca3ba0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:32.543384 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.543331 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "648f464e-2822-4169-8516-1f3382ca3ba0" (UID: "648f464e-2822-4169-8516-1f3382ca3ba0"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:32.543424 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.543400 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "648f464e-2822-4169-8516-1f3382ca3ba0" (UID: "648f464e-2822-4169-8516-1f3382ca3ba0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:32.543535 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.543512 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-console-config" (OuterVolumeSpecName: "console-config") pod "648f464e-2822-4169-8516-1f3382ca3ba0" (UID: "648f464e-2822-4169-8516-1f3382ca3ba0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:32.545069 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.545044 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "648f464e-2822-4169-8516-1f3382ca3ba0" (UID: "648f464e-2822-4169-8516-1f3382ca3ba0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:32.545172 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.545068 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/648f464e-2822-4169-8516-1f3382ca3ba0-kube-api-access-hnj29" (OuterVolumeSpecName: "kube-api-access-hnj29") pod "648f464e-2822-4169-8516-1f3382ca3ba0" (UID: "648f464e-2822-4169-8516-1f3382ca3ba0"). InnerVolumeSpecName "kube-api-access-hnj29". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:32.545172 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.545144 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "648f464e-2822-4169-8516-1f3382ca3ba0" (UID: "648f464e-2822-4169-8516-1f3382ca3ba0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:32.644270 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.644246 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-service-ca\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:32.644270 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.644267 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-trusted-ca-bundle\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:32.644409 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.644276 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-oauth-config\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:32.644409 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.644285 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-oauth-serving-cert\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:32.644409 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.644293 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/648f464e-2822-4169-8516-1f3382ca3ba0-console-config\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:32.644409 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.644302 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnj29\" (UniqueName: \"kubernetes.io/projected/648f464e-2822-4169-8516-1f3382ca3ba0-kube-api-access-hnj29\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:32.644409 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:32.644310 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/648f464e-2822-4169-8516-1f3382ca3ba0-console-serving-cert\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:26:33.187188 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.187163 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66489c6f5f-sjrkg_648f464e-2822-4169-8516-1f3382ca3ba0/console/0.log" Apr 16 18:26:33.187349 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.187199 2572 generic.go:358] "Generic (PLEG): container finished" podID="648f464e-2822-4169-8516-1f3382ca3ba0" containerID="7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa" exitCode=2 Apr 16 18:26:33.187349 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.187263 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66489c6f5f-sjrkg" Apr 16 18:26:33.187349 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.187265 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66489c6f5f-sjrkg" event={"ID":"648f464e-2822-4169-8516-1f3382ca3ba0","Type":"ContainerDied","Data":"7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa"} Apr 16 18:26:33.187442 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.187353 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66489c6f5f-sjrkg" event={"ID":"648f464e-2822-4169-8516-1f3382ca3ba0","Type":"ContainerDied","Data":"012cbda0c6b189df0e72b048a25f65f4640faa37c223db495de0c397b041c71b"} Apr 16 18:26:33.187442 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.187367 2572 scope.go:117] "RemoveContainer" containerID="7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa" Apr 16 18:26:33.195528 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.195511 2572 scope.go:117] "RemoveContainer" containerID="7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa" Apr 16 18:26:33.195771 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:26:33.195754 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa\": container with ID starting with 7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa not found: ID does not exist" containerID="7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa" Apr 16 18:26:33.195824 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.195777 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa"} err="failed to get container status \"7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa\": rpc error: code = 
NotFound desc = could not find container \"7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa\": container with ID starting with 7b0a96b28613814835d247d3e3ae1d38fdc47f7132b294aac4a7d50e956cc9fa not found: ID does not exist" Apr 16 18:26:33.209773 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.209693 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66489c6f5f-sjrkg"] Apr 16 18:26:33.214698 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.214676 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66489c6f5f-sjrkg"] Apr 16 18:26:33.582303 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:33.582270 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="648f464e-2822-4169-8516-1f3382ca3ba0" path="/var/lib/kubelet/pods/648f464e-2822-4169-8516-1f3382ca3ba0/volumes" Apr 16 18:26:40.175840 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:40.175811 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d8t2k" Apr 16 18:26:46.149966 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:46.149934 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-g5rbl" Apr 16 18:26:49.171009 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:49.170946 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-wz244" Apr 16 18:26:52.184400 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:26:52.184373 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-lxkfn" Apr 16 18:27:07.462657 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:07.462393 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 
18:27:07.464656 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:07.464638 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:27:30.986789 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:30.986704 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7668d57578-7hvnj"] Apr 16 18:27:30.987311 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:30.987173 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="648f464e-2822-4169-8516-1f3382ca3ba0" containerName="console" Apr 16 18:27:30.987311 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:30.987194 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="648f464e-2822-4169-8516-1f3382ca3ba0" containerName="console" Apr 16 18:27:30.987311 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:30.987275 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="648f464e-2822-4169-8516-1f3382ca3ba0" containerName="console" Apr 16 18:27:30.989868 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:30.989847 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:27:30.993385 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:30.993364 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-wssvl\"" Apr 16 18:27:30.993385 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:30.993375 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:27:30.993518 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:30.993364 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:27:30.993518 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:30.993362 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 18:27:31.003372 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:31.003347 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-7hvnj"] Apr 16 18:27:31.163072 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:31.163041 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc56066a-cf31-45ca-9498-b84709cd2bff-cert\") pod \"kserve-controller-manager-7668d57578-7hvnj\" (UID: \"cc56066a-cf31-45ca-9498-b84709cd2bff\") " pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:27:31.163189 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:31.163139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92wvt\" (UniqueName: \"kubernetes.io/projected/cc56066a-cf31-45ca-9498-b84709cd2bff-kube-api-access-92wvt\") pod \"kserve-controller-manager-7668d57578-7hvnj\" (UID: \"cc56066a-cf31-45ca-9498-b84709cd2bff\") " pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 
18:27:31.263828 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:31.263761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92wvt\" (UniqueName: \"kubernetes.io/projected/cc56066a-cf31-45ca-9498-b84709cd2bff-kube-api-access-92wvt\") pod \"kserve-controller-manager-7668d57578-7hvnj\" (UID: \"cc56066a-cf31-45ca-9498-b84709cd2bff\") " pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:27:31.263828 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:31.263799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc56066a-cf31-45ca-9498-b84709cd2bff-cert\") pod \"kserve-controller-manager-7668d57578-7hvnj\" (UID: \"cc56066a-cf31-45ca-9498-b84709cd2bff\") " pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:27:31.265968 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:31.265947 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc56066a-cf31-45ca-9498-b84709cd2bff-cert\") pod \"kserve-controller-manager-7668d57578-7hvnj\" (UID: \"cc56066a-cf31-45ca-9498-b84709cd2bff\") " pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:27:31.273528 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:31.273508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92wvt\" (UniqueName: \"kubernetes.io/projected/cc56066a-cf31-45ca-9498-b84709cd2bff-kube-api-access-92wvt\") pod \"kserve-controller-manager-7668d57578-7hvnj\" (UID: \"cc56066a-cf31-45ca-9498-b84709cd2bff\") " pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:27:31.299465 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:31.299440 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:27:31.417378 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:31.417356 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-7hvnj"] Apr 16 18:27:31.419838 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:27:31.419812 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc56066a_cf31_45ca_9498_b84709cd2bff.slice/crio-d0a0080cee40c5da0344640083772fffb98022d91187b83056cbaeabaa02e29b WatchSource:0}: Error finding container d0a0080cee40c5da0344640083772fffb98022d91187b83056cbaeabaa02e29b: Status 404 returned error can't find the container with id d0a0080cee40c5da0344640083772fffb98022d91187b83056cbaeabaa02e29b Apr 16 18:27:32.366506 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:32.366471 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" event={"ID":"cc56066a-cf31-45ca-9498-b84709cd2bff","Type":"ContainerStarted","Data":"d0a0080cee40c5da0344640083772fffb98022d91187b83056cbaeabaa02e29b"} Apr 16 18:27:34.374871 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:34.374842 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" event={"ID":"cc56066a-cf31-45ca-9498-b84709cd2bff","Type":"ContainerStarted","Data":"1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034"} Apr 16 18:27:34.375223 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:34.374979 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:27:34.392705 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:27:34.392665 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" podStartSLOduration=1.990917664 
podStartE2EDuration="4.392651484s" podCreationTimestamp="2026-04-16 18:27:30 +0000 UTC" firstStartedPulling="2026-04-16 18:27:31.42102001 +0000 UTC m=+624.446062789" lastFinishedPulling="2026-04-16 18:27:33.822753831 +0000 UTC m=+626.847796609" observedRunningTime="2026-04-16 18:27:34.391307026 +0000 UTC m=+627.416349847" watchObservedRunningTime="2026-04-16 18:27:34.392651484 +0000 UTC m=+627.417694348" Apr 16 18:28:05.383568 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:05.383541 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:28:07.609352 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.609323 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-7hvnj"] Apr 16 18:28:07.609763 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.609528 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" podUID="cc56066a-cf31-45ca-9498-b84709cd2bff" containerName="manager" containerID="cri-o://1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034" gracePeriod=10 Apr 16 18:28:07.645863 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.641372 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7668d57578-b2x86"] Apr 16 18:28:07.646171 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.646147 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:28:07.656538 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.656512 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-b2x86"] Apr 16 18:28:07.738625 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.738595 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41acd827-8fd4-405c-bd27-e46042b8ded8-cert\") pod \"kserve-controller-manager-7668d57578-b2x86\" (UID: \"41acd827-8fd4-405c-bd27-e46042b8ded8\") " pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:28:07.738749 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.738669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4fq\" (UniqueName: \"kubernetes.io/projected/41acd827-8fd4-405c-bd27-e46042b8ded8-kube-api-access-4n4fq\") pod \"kserve-controller-manager-7668d57578-b2x86\" (UID: \"41acd827-8fd4-405c-bd27-e46042b8ded8\") " pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:28:07.836008 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.835960 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:28:07.839274 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.839256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41acd827-8fd4-405c-bd27-e46042b8ded8-cert\") pod \"kserve-controller-manager-7668d57578-b2x86\" (UID: \"41acd827-8fd4-405c-bd27-e46042b8ded8\") " pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:28:07.839334 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.839307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4fq\" (UniqueName: \"kubernetes.io/projected/41acd827-8fd4-405c-bd27-e46042b8ded8-kube-api-access-4n4fq\") pod \"kserve-controller-manager-7668d57578-b2x86\" (UID: \"41acd827-8fd4-405c-bd27-e46042b8ded8\") " pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:28:07.841501 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.841478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41acd827-8fd4-405c-bd27-e46042b8ded8-cert\") pod \"kserve-controller-manager-7668d57578-b2x86\" (UID: \"41acd827-8fd4-405c-bd27-e46042b8ded8\") " pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:28:07.848394 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.848374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n4fq\" (UniqueName: \"kubernetes.io/projected/41acd827-8fd4-405c-bd27-e46042b8ded8-kube-api-access-4n4fq\") pod \"kserve-controller-manager-7668d57578-b2x86\" (UID: \"41acd827-8fd4-405c-bd27-e46042b8ded8\") " pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:28:07.939752 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.939683 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cc56066a-cf31-45ca-9498-b84709cd2bff-cert\") pod \"cc56066a-cf31-45ca-9498-b84709cd2bff\" (UID: \"cc56066a-cf31-45ca-9498-b84709cd2bff\") " Apr 16 18:28:07.939752 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.939739 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92wvt\" (UniqueName: \"kubernetes.io/projected/cc56066a-cf31-45ca-9498-b84709cd2bff-kube-api-access-92wvt\") pod \"cc56066a-cf31-45ca-9498-b84709cd2bff\" (UID: \"cc56066a-cf31-45ca-9498-b84709cd2bff\") " Apr 16 18:28:07.941747 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.941721 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc56066a-cf31-45ca-9498-b84709cd2bff-kube-api-access-92wvt" (OuterVolumeSpecName: "kube-api-access-92wvt") pod "cc56066a-cf31-45ca-9498-b84709cd2bff" (UID: "cc56066a-cf31-45ca-9498-b84709cd2bff"). InnerVolumeSpecName "kube-api-access-92wvt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:28:07.941865 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.941769 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc56066a-cf31-45ca-9498-b84709cd2bff-cert" (OuterVolumeSpecName: "cert") pod "cc56066a-cf31-45ca-9498-b84709cd2bff" (UID: "cc56066a-cf31-45ca-9498-b84709cd2bff"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:28:07.981453 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:07.981419 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:28:08.041002 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.040966 2572 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc56066a-cf31-45ca-9498-b84709cd2bff-cert\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:28:08.041165 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.041012 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-92wvt\" (UniqueName: \"kubernetes.io/projected/cc56066a-cf31-45ca-9498-b84709cd2bff-kube-api-access-92wvt\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:28:08.093185 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.093158 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-b2x86"] Apr 16 18:28:08.095594 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:28:08.095567 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41acd827_8fd4_405c_bd27_e46042b8ded8.slice/crio-5258538c00935b19406f5f78e30044157bd196155a34a8c966098d1cf0954983 WatchSource:0}: Error finding container 5258538c00935b19406f5f78e30044157bd196155a34a8c966098d1cf0954983: Status 404 returned error can't find the container with id 5258538c00935b19406f5f78e30044157bd196155a34a8c966098d1cf0954983 Apr 16 18:28:08.483303 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.483276 2572 generic.go:358] "Generic (PLEG): container finished" podID="cc56066a-cf31-45ca-9498-b84709cd2bff" containerID="1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034" exitCode=0 Apr 16 18:28:08.483420 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.483337 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" Apr 16 18:28:08.483420 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.483342 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" event={"ID":"cc56066a-cf31-45ca-9498-b84709cd2bff","Type":"ContainerDied","Data":"1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034"} Apr 16 18:28:08.483533 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.483429 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-7hvnj" event={"ID":"cc56066a-cf31-45ca-9498-b84709cd2bff","Type":"ContainerDied","Data":"d0a0080cee40c5da0344640083772fffb98022d91187b83056cbaeabaa02e29b"} Apr 16 18:28:08.483533 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.483451 2572 scope.go:117] "RemoveContainer" containerID="1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034" Apr 16 18:28:08.484575 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.484550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-b2x86" event={"ID":"41acd827-8fd4-405c-bd27-e46042b8ded8","Type":"ContainerStarted","Data":"5258538c00935b19406f5f78e30044157bd196155a34a8c966098d1cf0954983"} Apr 16 18:28:08.493881 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.493866 2572 scope.go:117] "RemoveContainer" containerID="1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034" Apr 16 18:28:08.494163 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:28:08.494144 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034\": container with ID starting with 1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034 not found: ID does not exist" containerID="1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034" Apr 16 
18:28:08.494247 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.494171 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034"} err="failed to get container status \"1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034\": rpc error: code = NotFound desc = could not find container \"1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034\": container with ID starting with 1d3572cae0ceea08b9c9e2cc2236e416afce1acd2210188486ec5dd70db85034 not found: ID does not exist" Apr 16 18:28:08.523117 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.523097 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-7hvnj"] Apr 16 18:28:08.528085 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:08.528066 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-7hvnj"] Apr 16 18:28:09.489055 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:09.489024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-b2x86" event={"ID":"41acd827-8fd4-405c-bd27-e46042b8ded8","Type":"ContainerStarted","Data":"d1c4a1400c9134d1a98ac6e73cb5cbde8ad54bf4e6fc131803bcd328c45c1713"} Apr 16 18:28:09.489429 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:09.489129 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:28:09.508786 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:09.508742 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7668d57578-b2x86" podStartSLOduration=2.132310369 podStartE2EDuration="2.508732261s" podCreationTimestamp="2026-04-16 18:28:07 +0000 UTC" firstStartedPulling="2026-04-16 18:28:08.096980965 +0000 UTC m=+661.122023741" lastFinishedPulling="2026-04-16 
18:28:08.473402857 +0000 UTC m=+661.498445633" observedRunningTime="2026-04-16 18:28:09.507646045 +0000 UTC m=+662.532688840" watchObservedRunningTime="2026-04-16 18:28:09.508732261 +0000 UTC m=+662.533775058" Apr 16 18:28:09.582181 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:09.582155 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc56066a-cf31-45ca-9498-b84709cd2bff" path="/var/lib/kubelet/pods/cc56066a-cf31-45ca-9498-b84709cd2bff/volumes" Apr 16 18:28:40.497432 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:28:40.497399 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7668d57578-b2x86" Apr 16 18:29:08.269659 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.269626 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5"] Apr 16 18:29:08.274592 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.274560 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc56066a-cf31-45ca-9498-b84709cd2bff" containerName="manager" Apr 16 18:29:08.274725 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.274705 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc56066a-cf31-45ca-9498-b84709cd2bff" containerName="manager" Apr 16 18:29:08.274910 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.274899 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc56066a-cf31-45ca-9498-b84709cd2bff" containerName="manager" Apr 16 18:29:08.278062 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.278040 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:08.280264 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.280241 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bqs8m\"" Apr 16 18:29:08.280264 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.280241 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 18:29:08.281001 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.280968 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5"] Apr 16 18:29:08.392250 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.392229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/487dee22-8e94-433e-96fc-4725e74e78bf-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-8zqx5\" (UID: \"487dee22-8e94-433e-96fc-4725e74e78bf\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:08.392342 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.392260 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdllz\" (UniqueName: \"kubernetes.io/projected/487dee22-8e94-433e-96fc-4725e74e78bf-kube-api-access-kdllz\") pod \"seaweedfs-tls-custom-ddd4dbfd-8zqx5\" (UID: \"487dee22-8e94-433e-96fc-4725e74e78bf\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:08.493213 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.493181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/487dee22-8e94-433e-96fc-4725e74e78bf-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-8zqx5\" (UID: \"487dee22-8e94-433e-96fc-4725e74e78bf\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:08.493366 ip-10-0-138-175 kubenswrapper[2572]: 
I0416 18:29:08.493228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdllz\" (UniqueName: \"kubernetes.io/projected/487dee22-8e94-433e-96fc-4725e74e78bf-kube-api-access-kdllz\") pod \"seaweedfs-tls-custom-ddd4dbfd-8zqx5\" (UID: \"487dee22-8e94-433e-96fc-4725e74e78bf\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:08.493531 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.493509 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/487dee22-8e94-433e-96fc-4725e74e78bf-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-8zqx5\" (UID: \"487dee22-8e94-433e-96fc-4725e74e78bf\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:08.502094 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.502074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdllz\" (UniqueName: \"kubernetes.io/projected/487dee22-8e94-433e-96fc-4725e74e78bf-kube-api-access-kdllz\") pod \"seaweedfs-tls-custom-ddd4dbfd-8zqx5\" (UID: \"487dee22-8e94-433e-96fc-4725e74e78bf\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:08.588545 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.588491 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:08.702114 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:08.702093 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5"] Apr 16 18:29:08.704437 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:29:08.704411 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487dee22_8e94_433e_96fc_4725e74e78bf.slice/crio-468e4b29ba5c42cdf10f071da5786bada09174dd701eb2278e3e0fbabc6b70ad WatchSource:0}: Error finding container 468e4b29ba5c42cdf10f071da5786bada09174dd701eb2278e3e0fbabc6b70ad: Status 404 returned error can't find the container with id 468e4b29ba5c42cdf10f071da5786bada09174dd701eb2278e3e0fbabc6b70ad Apr 16 18:29:09.669427 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:09.669396 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" event={"ID":"487dee22-8e94-433e-96fc-4725e74e78bf","Type":"ContainerStarted","Data":"468e4b29ba5c42cdf10f071da5786bada09174dd701eb2278e3e0fbabc6b70ad"} Apr 16 18:29:11.677190 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:11.677158 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" event={"ID":"487dee22-8e94-433e-96fc-4725e74e78bf","Type":"ContainerStarted","Data":"a17e0158bbfc9ca827a6a1b02377c6c87195d2c7ea86a66282728afc3f9998b8"} Apr 16 18:29:11.697697 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:11.697652 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" podStartSLOduration=1.308429518 podStartE2EDuration="3.6976377s" podCreationTimestamp="2026-04-16 18:29:08 +0000 UTC" firstStartedPulling="2026-04-16 18:29:08.705679732 +0000 UTC m=+721.730722508" lastFinishedPulling="2026-04-16 18:29:11.094887914 +0000 UTC m=+724.119930690" 
observedRunningTime="2026-04-16 18:29:11.696602981 +0000 UTC m=+724.721645778" watchObservedRunningTime="2026-04-16 18:29:11.6976377 +0000 UTC m=+724.722680498" Apr 16 18:29:12.939286 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:12.939254 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5"] Apr 16 18:29:13.685601 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:13.685563 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" podUID="487dee22-8e94-433e-96fc-4725e74e78bf" containerName="seaweedfs-tls-custom" containerID="cri-o://a17e0158bbfc9ca827a6a1b02377c6c87195d2c7ea86a66282728afc3f9998b8" gracePeriod=30 Apr 16 18:29:41.773066 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:41.773035 2572 generic.go:358] "Generic (PLEG): container finished" podID="487dee22-8e94-433e-96fc-4725e74e78bf" containerID="a17e0158bbfc9ca827a6a1b02377c6c87195d2c7ea86a66282728afc3f9998b8" exitCode=0 Apr 16 18:29:41.773431 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:41.773096 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" event={"ID":"487dee22-8e94-433e-96fc-4725e74e78bf","Type":"ContainerDied","Data":"a17e0158bbfc9ca827a6a1b02377c6c87195d2c7ea86a66282728afc3f9998b8"} Apr 16 18:29:41.819927 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:41.819905 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:41.957566 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:41.957478 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdllz\" (UniqueName: \"kubernetes.io/projected/487dee22-8e94-433e-96fc-4725e74e78bf-kube-api-access-kdllz\") pod \"487dee22-8e94-433e-96fc-4725e74e78bf\" (UID: \"487dee22-8e94-433e-96fc-4725e74e78bf\") " Apr 16 18:29:41.957566 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:41.957557 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/487dee22-8e94-433e-96fc-4725e74e78bf-data\") pod \"487dee22-8e94-433e-96fc-4725e74e78bf\" (UID: \"487dee22-8e94-433e-96fc-4725e74e78bf\") " Apr 16 18:29:41.958874 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:41.958842 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487dee22-8e94-433e-96fc-4725e74e78bf-data" (OuterVolumeSpecName: "data") pod "487dee22-8e94-433e-96fc-4725e74e78bf" (UID: "487dee22-8e94-433e-96fc-4725e74e78bf"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:29:41.959519 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:41.959497 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487dee22-8e94-433e-96fc-4725e74e78bf-kube-api-access-kdllz" (OuterVolumeSpecName: "kube-api-access-kdllz") pod "487dee22-8e94-433e-96fc-4725e74e78bf" (UID: "487dee22-8e94-433e-96fc-4725e74e78bf"). InnerVolumeSpecName "kube-api-access-kdllz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:29:42.059057 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:42.059027 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdllz\" (UniqueName: \"kubernetes.io/projected/487dee22-8e94-433e-96fc-4725e74e78bf-kube-api-access-kdllz\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:29:42.059057 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:42.059054 2572 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/487dee22-8e94-433e-96fc-4725e74e78bf-data\") on node \"ip-10-0-138-175.ec2.internal\" DevicePath \"\"" Apr 16 18:29:42.777341 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:42.777313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" event={"ID":"487dee22-8e94-433e-96fc-4725e74e78bf","Type":"ContainerDied","Data":"468e4b29ba5c42cdf10f071da5786bada09174dd701eb2278e3e0fbabc6b70ad"} Apr 16 18:29:42.777784 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:42.777351 2572 scope.go:117] "RemoveContainer" containerID="a17e0158bbfc9ca827a6a1b02377c6c87195d2c7ea86a66282728afc3f9998b8" Apr 16 18:29:42.777784 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:42.777352 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5" Apr 16 18:29:42.798971 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:42.798947 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5"] Apr 16 18:29:42.802784 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:42.802764 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8zqx5"] Apr 16 18:29:43.583214 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:43.583183 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487dee22-8e94-433e-96fc-4725e74e78bf" path="/var/lib/kubelet/pods/487dee22-8e94-433e-96fc-4725e74e78bf/volumes" Apr 16 18:29:54.253500 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.253472 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk"] Apr 16 18:29:54.253856 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.253759 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="487dee22-8e94-433e-96fc-4725e74e78bf" containerName="seaweedfs-tls-custom" Apr 16 18:29:54.253856 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.253769 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="487dee22-8e94-433e-96fc-4725e74e78bf" containerName="seaweedfs-tls-custom" Apr 16 18:29:54.253856 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.253820 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="487dee22-8e94-433e-96fc-4725e74e78bf" containerName="seaweedfs-tls-custom" Apr 16 18:29:54.256700 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.256681 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.259291 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.259272 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 16 18:29:54.259385 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.259276 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 18:29:54.259385 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.259318 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bqs8m\"" Apr 16 18:29:54.264999 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.264968 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk"] Apr 16 18:29:54.347809 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.347779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/2916cac8-77b6-4f16-bb6b-e15da6e85c93-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.347935 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.347817 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mwmc\" (UniqueName: \"kubernetes.io/projected/2916cac8-77b6-4f16-bb6b-e15da6e85c93-kube-api-access-7mwmc\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.347935 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.347904 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/2916cac8-77b6-4f16-bb6b-e15da6e85c93-data\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.448956 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.448925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/2916cac8-77b6-4f16-bb6b-e15da6e85c93-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.449120 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.448963 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mwmc\" (UniqueName: \"kubernetes.io/projected/2916cac8-77b6-4f16-bb6b-e15da6e85c93-kube-api-access-7mwmc\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.449120 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.449039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2916cac8-77b6-4f16-bb6b-e15da6e85c93-data\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.449120 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:29:54.449080 2572 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 16 18:29:54.449120 ip-10-0-138-175 kubenswrapper[2572]: E0416 18:29:54.449096 2572 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk: secret "seaweedfs-tls-serving" not found Apr 16 18:29:54.449350 
ip-10-0-138-175 kubenswrapper[2572]: E0416 18:29:54.449149 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2916cac8-77b6-4f16-bb6b-e15da6e85c93-seaweedfs-tls-serving podName:2916cac8-77b6-4f16-bb6b-e15da6e85c93 nodeName:}" failed. No retries permitted until 2026-04-16 18:29:54.949132415 +0000 UTC m=+767.974175192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/2916cac8-77b6-4f16-bb6b-e15da6e85c93-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-bkjzk" (UID: "2916cac8-77b6-4f16-bb6b-e15da6e85c93") : secret "seaweedfs-tls-serving" not found Apr 16 18:29:54.449442 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.449424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2916cac8-77b6-4f16-bb6b-e15da6e85c93-data\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.457956 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.457936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mwmc\" (UniqueName: \"kubernetes.io/projected/2916cac8-77b6-4f16-bb6b-e15da6e85c93-kube-api-access-7mwmc\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.952972 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:54.952922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/2916cac8-77b6-4f16-bb6b-e15da6e85c93-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:54.955197 ip-10-0-138-175 
kubenswrapper[2572]: I0416 18:29:54.955175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/2916cac8-77b6-4f16-bb6b-e15da6e85c93-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-bkjzk\" (UID: \"2916cac8-77b6-4f16-bb6b-e15da6e85c93\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:55.166208 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:55.166176 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" Apr 16 18:29:55.281240 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:55.281212 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk"] Apr 16 18:29:55.283131 ip-10-0-138-175 kubenswrapper[2572]: W0416 18:29:55.283105 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2916cac8_77b6_4f16_bb6b_e15da6e85c93.slice/crio-f66d576e7cba933e00627f36c12537688754723040626a0ad10f291951ee32ba WatchSource:0}: Error finding container f66d576e7cba933e00627f36c12537688754723040626a0ad10f291951ee32ba: Status 404 returned error can't find the container with id f66d576e7cba933e00627f36c12537688754723040626a0ad10f291951ee32ba Apr 16 18:29:55.818101 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:55.818015 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" event={"ID":"2916cac8-77b6-4f16-bb6b-e15da6e85c93","Type":"ContainerStarted","Data":"3f3209d96753215560e5d8dce81ab636a1d5767e8137ecc7dccb2a9e5d1ef500"} Apr 16 18:29:55.818101 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:55.818052 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" 
event={"ID":"2916cac8-77b6-4f16-bb6b-e15da6e85c93","Type":"ContainerStarted","Data":"f66d576e7cba933e00627f36c12537688754723040626a0ad10f291951ee32ba"} Apr 16 18:29:55.834565 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:29:55.834519 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-bkjzk" podStartSLOduration=1.5666567329999999 podStartE2EDuration="1.83450628s" podCreationTimestamp="2026-04-16 18:29:54 +0000 UTC" firstStartedPulling="2026-04-16 18:29:55.284378926 +0000 UTC m=+768.309421705" lastFinishedPulling="2026-04-16 18:29:55.552228472 +0000 UTC m=+768.577271252" observedRunningTime="2026-04-16 18:29:55.833648855 +0000 UTC m=+768.858691677" watchObservedRunningTime="2026-04-16 18:29:55.83450628 +0000 UTC m=+768.859549078" Apr 16 18:32:07.482071 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:32:07.482046 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:32:07.485718 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:32:07.485696 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:37:07.502035 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:37:07.502010 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:37:07.509971 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:37:07.509950 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:42:07.521133 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:42:07.521107 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:42:07.530258 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:42:07.530239 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:47:07.540715 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:47:07.540680 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:47:07.550345 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:47:07.550321 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:52:07.562328 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:52:07.562298 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:52:07.572465 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:52:07.572446 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:57:07.581844 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:57:07.581817 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 18:57:07.592796 ip-10-0-138-175 kubenswrapper[2572]: I0416 18:57:07.592775 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:02:07.605336 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:02:07.605309 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:02:07.614655 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:02:07.614634 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:07:07.627822 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:07:07.627792 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:07:07.636960 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:07:07.636940 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:12:07.646453 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:12:07.646363 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:12:07.656505 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:12:07.656486 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:17:07.665978 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:17:07.665874 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:17:07.676367 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:17:07.676348 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:22:07.694173 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:22:07.694073 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:22:07.704879 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:22:07.704856 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:24:27.090250 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:27.090184 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wh7d5_8cd9794f-105a-4fbf-ae0d-7f399cb33595/global-pull-secret-syncer/0.log" Apr 16 19:24:27.290732 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:27.290705 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vwvb6_612b22f4-6ba0-49fb-8b66-2cff81a247be/konnectivity-agent/0.log" Apr 16 19:24:27.345768 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:27.345698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-175.ec2.internal_f1b0f110a831d4055d6fd2016cd3ab02/haproxy/0.log" Apr 16 19:24:30.056525 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.056498 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9daafdb0-a23f-458e-b3bf-c2f20a0fbd12/alertmanager/0.log" Apr 16 19:24:30.127423 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.127394 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9daafdb0-a23f-458e-b3bf-c2f20a0fbd12/config-reloader/0.log" Apr 16 19:24:30.177118 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.177094 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9daafdb0-a23f-458e-b3bf-c2f20a0fbd12/kube-rbac-proxy-web/0.log" Apr 16 19:24:30.231170 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.231146 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9daafdb0-a23f-458e-b3bf-c2f20a0fbd12/kube-rbac-proxy/0.log" Apr 16 19:24:30.292735 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.292711 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9daafdb0-a23f-458e-b3bf-c2f20a0fbd12/kube-rbac-proxy-metric/0.log" Apr 16 19:24:30.332743 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.332681 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9daafdb0-a23f-458e-b3bf-c2f20a0fbd12/prom-label-proxy/0.log" Apr 16 19:24:30.369016 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.368979 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9daafdb0-a23f-458e-b3bf-c2f20a0fbd12/init-config-reloader/0.log" Apr 16 19:24:30.727099 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.727067 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jk75h_7e446bf9-2453-42ef-af7d-39fb7389ba58/node-exporter/0.log" Apr 16 19:24:30.762237 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.762211 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jk75h_7e446bf9-2453-42ef-af7d-39fb7389ba58/kube-rbac-proxy/0.log" Apr 16 19:24:30.792997 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:30.792974 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jk75h_7e446bf9-2453-42ef-af7d-39fb7389ba58/init-textfile/0.log" Apr 16 19:24:33.805893 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:33.805864 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-rzb5c_85a99918-c029-45a6-bd5c-b9ed19071944/download-server/0.log" Apr 16 19:24:34.461107 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.461075 2572 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277"] Apr 16 19:24:34.464189 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.464167 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.466435 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.466416 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7m7f8\"/\"kube-root-ca.crt\"" Apr 16 19:24:34.467226 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.467210 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7m7f8\"/\"openshift-service-ca.crt\"" Apr 16 19:24:34.467335 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.467209 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7m7f8\"/\"default-dockercfg-jlsz5\"" Apr 16 19:24:34.473504 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.473485 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277"] Apr 16 19:24:34.616284 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.616252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9gcb\" (UniqueName: \"kubernetes.io/projected/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-kube-api-access-k9gcb\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.616456 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.616334 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-podres\") pod \"perf-node-gather-daemonset-js277\" (UID: 
\"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.616456 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.616400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-lib-modules\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.616456 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.616430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-sys\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.616456 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.616452 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-proc\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.717112 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.717037 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-lib-modules\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.717112 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.717081 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-sys\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.717112 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.717099 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-proc\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.717371 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.717156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9gcb\" (UniqueName: \"kubernetes.io/projected/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-kube-api-access-k9gcb\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.717371 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.717186 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-sys\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.717371 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.717200 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-proc\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.717371 ip-10-0-138-175 kubenswrapper[2572]: 
I0416 19:24:34.717222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-lib-modules\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.717371 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.717259 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-podres\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.717538 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.717417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-podres\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.727132 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.727116 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9gcb\" (UniqueName: \"kubernetes.io/projected/181e598b-c5f5-48eb-b59f-fc4ca1cf07a7-kube-api-access-k9gcb\") pod \"perf-node-gather-daemonset-js277\" (UID: \"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.774259 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.774224 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:34.903751 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.903719 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277"] Apr 16 19:24:34.907200 ip-10-0-138-175 kubenswrapper[2572]: W0416 19:24:34.907170 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod181e598b_c5f5_48eb_b59f_fc4ca1cf07a7.slice/crio-768c431639fd71df59932ad19e9816859f6bb5a2f9e21dfcc4619d6700757f6f WatchSource:0}: Error finding container 768c431639fd71df59932ad19e9816859f6bb5a2f9e21dfcc4619d6700757f6f: Status 404 returned error can't find the container with id 768c431639fd71df59932ad19e9816859f6bb5a2f9e21dfcc4619d6700757f6f Apr 16 19:24:34.908888 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:34.908874 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:24:35.090225 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:35.090196 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dj5sz_528ce5b9-64df-485a-8269-fb5ba8dc8ba5/dns/0.log" Apr 16 19:24:35.118672 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:35.118646 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dj5sz_528ce5b9-64df-485a-8269-fb5ba8dc8ba5/kube-rbac-proxy/0.log" Apr 16 19:24:35.276783 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:35.276760 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xxr9z_dc04a45e-8316-4c0a-86a5-12c986bd0756/dns-node-resolver/0.log" Apr 16 19:24:35.622936 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:35.622904 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" 
event={"ID":"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7","Type":"ContainerStarted","Data":"c597918795b5284899b52ba22bbeeabaf06207f34118073ab216101e16facf65"} Apr 16 19:24:35.622936 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:35.622936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" event={"ID":"181e598b-c5f5-48eb-b59f-fc4ca1cf07a7","Type":"ContainerStarted","Data":"768c431639fd71df59932ad19e9816859f6bb5a2f9e21dfcc4619d6700757f6f"} Apr 16 19:24:35.623153 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:35.622960 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:35.655690 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:35.655634 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" podStartSLOduration=1.655615219 podStartE2EDuration="1.655615219s" podCreationTimestamp="2026-04-16 19:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:24:35.65293813 +0000 UTC m=+4048.677980927" watchObservedRunningTime="2026-04-16 19:24:35.655615219 +0000 UTC m=+4048.680658016" Apr 16 19:24:35.756805 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:35.756773 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8d7db7-222bz_3d96497b-7167-4484-9a0e-c7052ca624e7/registry/0.log" Apr 16 19:24:35.860797 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:35.860771 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v22jh_ebe5bfb0-23f3-4c66-9cc1-2436ea624b37/node-ca/0.log" Apr 16 19:24:36.719243 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:36.719218 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-5fc65bbfd4-n74q6_49cc1dab-4e51-4560-8096-2b481c666fa4/router/0.log" Apr 16 19:24:37.106778 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:37.106705 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9c9hd_5226006a-a858-4a19-a1d5-544f65d3a882/serve-healthcheck-canary/0.log" Apr 16 19:24:37.569945 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:37.569915 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-xmqgp_1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d/insights-operator/0.log" Apr 16 19:24:37.570839 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:37.570821 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-xmqgp_1c91bbcf-fe4a-4f0d-b7c2-41f6d8de652d/insights-operator/1.log" Apr 16 19:24:37.782953 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:37.782928 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rhp8h_d1a9a0b8-ff7a-46a5-90b6-9cac55628412/kube-rbac-proxy/0.log" Apr 16 19:24:37.809950 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:37.809928 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rhp8h_d1a9a0b8-ff7a-46a5-90b6-9cac55628412/exporter/0.log" Apr 16 19:24:37.844279 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:37.844216 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rhp8h_d1a9a0b8-ff7a-46a5-90b6-9cac55628412/extractor/0.log" Apr 16 19:24:39.998859 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:39.998834 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7668d57578-b2x86_41acd827-8fd4-405c-bd27-e46042b8ded8/manager/0.log" Apr 16 19:24:40.671389 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:40.671358 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-bkjzk_2916cac8-77b6-4f16-bb6b-e15da6e85c93/seaweedfs-tls-serving/0.log" Apr 16 19:24:41.634700 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:41.634670 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-js277" Apr 16 19:24:46.841132 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:46.841105 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64mfc_711c83e1-762b-4a01-8f25-65c6c4407f6d/kube-multus-additional-cni-plugins/0.log" Apr 16 19:24:46.868430 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:46.868404 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64mfc_711c83e1-762b-4a01-8f25-65c6c4407f6d/egress-router-binary-copy/0.log" Apr 16 19:24:46.895253 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:46.895225 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64mfc_711c83e1-762b-4a01-8f25-65c6c4407f6d/cni-plugins/0.log" Apr 16 19:24:46.919139 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:46.919111 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64mfc_711c83e1-762b-4a01-8f25-65c6c4407f6d/bond-cni-plugin/0.log" Apr 16 19:24:46.951680 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:46.951657 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64mfc_711c83e1-762b-4a01-8f25-65c6c4407f6d/routeoverride-cni/0.log" Apr 16 19:24:46.972948 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:46.972927 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64mfc_711c83e1-762b-4a01-8f25-65c6c4407f6d/whereabouts-cni-bincopy/0.log" Apr 16 
19:24:46.998004 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:46.997968 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64mfc_711c83e1-762b-4a01-8f25-65c6c4407f6d/whereabouts-cni/0.log" Apr 16 19:24:47.524936 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:47.524908 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjsh6_6a0f7718-67dc-4701-a6d9-a0f852eb4441/kube-multus/0.log" Apr 16 19:24:47.666905 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:47.666880 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kn4cv_8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7/network-metrics-daemon/0.log" Apr 16 19:24:47.695725 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:47.695704 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kn4cv_8a0f68c0-813b-43dc-ae6c-b7aff3eb0ee7/kube-rbac-proxy/0.log" Apr 16 19:24:48.857315 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:48.857286 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-controller/0.log" Apr 16 19:24:48.883496 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:48.883458 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/0.log" Apr 16 19:24:48.901224 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:48.901187 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovn-acl-logging/1.log" Apr 16 19:24:48.930320 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:48.930291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/kube-rbac-proxy-node/0.log" 
Apr 16 19:24:48.964847 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:48.964818 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:24:49.006118 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:49.006094 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/northd/0.log" Apr 16 19:24:49.035757 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:49.035732 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/nbdb/0.log" Apr 16 19:24:49.091927 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:49.091896 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/sbdb/0.log" Apr 16 19:24:49.232230 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:49.232205 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vpdh_8c9b6947-3fb0-4af0-baf5-9af029c0ab42/ovnkube-controller/0.log" Apr 16 19:24:50.960015 ip-10-0-138-175 kubenswrapper[2572]: I0416 19:24:50.959968 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rbqgb_f3ea7281-e2af-4d20-96b3-8d0ff6b62ef5/network-check-target-container/0.log"