Apr 16 19:53:30.060170 ip-10-0-137-239 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:30.554608 ip-10-0-137-239 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:30.554608 ip-10-0-137-239 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:30.554608 ip-10-0-137-239 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:30.554608 ip-10-0-137-239 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:30.554608 ip-10-0-137-239 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
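The deprecation warnings above all point at the same remedy: move the flag values into the file passed to the kubelet's --config flag. A minimal sketch of the equivalent KubeletConfiguration (field names from the kubelet.config.k8s.io/v1beta1 API; the endpoint value is taken from the flag dump later in this log, all other values are illustrative placeholders, not taken from this node):

```yaml
# Sketch of a kubelet config file (e.g. /etc/kubernetes/kubelet.conf)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value from this node's flag dump)
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (placeholder path)
volumePluginDir: "/var/lib/kubelet/volumeplugins"
# replaces --system-reserved (placeholder amounts)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# the --minimum-container-ttl-duration warning says to use eviction
# thresholds instead (placeholder threshold)
evictionHard:
  memory.available: "100Mi"
```

Flags left on the command line take precedence over the config file, so a value should be removed from the kubelet's unit file when it is moved here.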
Apr 16 19:53:30.557482 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.557399 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:53:30.561486 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561470 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:30.561486 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561486 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561490 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561493 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561496 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561498 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561501 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561504 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561507 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561510 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561513 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561516 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561519 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561521 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561524 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561526 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561529 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561539 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561542 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561547 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561551 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:30.561549 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561554 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561557 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561561 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561564 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561567 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561570 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561572 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561575 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561578 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561580 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561583 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561586 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561588 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561591 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561594 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561596 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561599 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561601 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561604 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561606 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:30.562038 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561609 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561611 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561614 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561617 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561621 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561624 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561627 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561630 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561632 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561635 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561637 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561640 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561642 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561645 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561649 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561652 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561654 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561657 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561659 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561662 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:30.562522 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561665 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561667 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561670 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561674 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561678 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561682 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561684 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561687 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561691 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561694 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561696 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561699 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561701 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561704 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561707 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561709 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561712 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561714 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561717 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:30.563029 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561719 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:30.563489 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561722 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:30.563489 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561725 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:30.563489 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561728 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:30.563489 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561730 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:30.563489 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.561733 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563524 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563532 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563535 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563539 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563542 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563545 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563547 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563550 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563552 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563555 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563557 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563560 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563563 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563566 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563569 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563572 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563574 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563577 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563579 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:30.563616 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563582 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563584 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563587 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563589 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563592 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563594 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563597 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563599 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563602 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563604 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563607 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563609 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563611 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563616 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563621 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563624 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563627 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563630 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563633 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:30.564124 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563636 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563638 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563641 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563643 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563646 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563648 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563651 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563654 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563657 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563659 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563662 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563664 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563667 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563669 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563672 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563674 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563677 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563680 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563682 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563686 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:30.564607 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563689 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563691 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563694 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563696 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563699 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563704 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563707 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563710 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563713 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563716 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563718 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563721 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563724 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563726 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563729 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563731 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563734 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563737 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563739 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:30.565126 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563742 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563745 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563747 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563763 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563766 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563769 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563772 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563774 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.563777 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.563846 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.563853 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564461 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564467 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564473 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564479 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564485 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564490 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564493 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564497 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564500 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564504 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564507 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:30.565587 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564510 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564513 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564516 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564519 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564522 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564525 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564529 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564532 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564535 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564537 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564541 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564545 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564549 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564553 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564556 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564559 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564562 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564565 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564568 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564571 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564576 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564579 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564582 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564584 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564588 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:30.566141 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564591 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564596 2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564599 2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564602 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564605 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564609 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564613 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564616 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564620 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564623 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564626 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564629 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564632 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564635 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564638 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564641 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564644 2574 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564648 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564651 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564654 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564658 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564661 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564665 2574 flags.go:64] FLAG:
--help="false" Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564669 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-137-239.ec2.internal" Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564672 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:53:30.566776 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564675 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564678 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564681 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564685 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564688 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564691 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564694 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564696 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564699 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564702 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564705 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:53:30.567386 ip-10-0-137-239 
kubenswrapper[2574]: I0416 19:53:30.564708 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564711 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564715 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564717 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564720 2574 flags.go:64] FLAG: --lock-file="" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564723 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564726 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564730 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564735 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564738 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564741 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564744 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 19:53:30.567386 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564747 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564765 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564769 2574 flags.go:64] 
FLAG: --manifest-url="" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564772 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564776 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564794 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564804 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564807 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564810 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564813 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564816 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564820 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564823 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564826 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564833 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564836 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564838 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 
19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564841 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564844 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564850 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564853 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564856 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564859 2574 flags.go:64] FLAG: --port="10250" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564862 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:53:30.567984 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564865 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0786e21af704d5538" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564869 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564872 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564875 2574 flags.go:64] FLAG: --register-node="true" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564878 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564881 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564885 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:53:30.568552 ip-10-0-137-239 
kubenswrapper[2574]: I0416 19:53:30.564887 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564890 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564893 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564896 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564899 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564902 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564905 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564909 2574 flags.go:64] FLAG: --runonce="false" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564912 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564915 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564918 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564921 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564924 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564927 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564930 2574 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564933 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564940 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564943 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564946 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:53:30.568552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564949 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564952 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564955 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564958 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564964 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564967 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564970 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564974 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564977 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564980 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:53:30.569310 ip-10-0-137-239 
kubenswrapper[2574]: I0416 19:53:30.564983 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564986 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564989 2574 flags.go:64] FLAG: --v="2" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564994 2574 flags.go:64] FLAG: --version="false" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.564998 2574 flags.go:64] FLAG: --vmodule="" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.565002 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.565006 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565103 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565107 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565110 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565113 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565116 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565119 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565123 2574 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall Apr 16 19:53:30.569310 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565125 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565128 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565131 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565134 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565137 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565141 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565144 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565146 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565150 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565154 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565157 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565160 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565162 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565165 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565168 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565170 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565173 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565176 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565178 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:30.569938 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565181 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565183 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 
19:53:30.565186 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565188 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565191 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565194 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565196 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565199 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565202 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565206 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565209 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565213 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565216 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565219 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565222 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565225 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565227 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565230 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565234 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565237 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:30.570420 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565240 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565243 2574 feature_gate.go:328] unrecognized feature 
gate: SigstoreImageVerificationPKI Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565245 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565248 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565250 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565253 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565256 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565258 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565261 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565264 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565268 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565271 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565273 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565276 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 
19:53:30.565278 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565281 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565283 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565286 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565289 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565291 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:30.570937 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565294 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565296 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565299 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565302 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565305 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565307 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565310 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 
19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565313 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565315 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565318 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565321 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565324 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565327 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565329 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565332 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565335 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565337 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565340 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565342 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:30.571440 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.565345 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:30.571929 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.566101 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:30.575164 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.575138 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:53:30.575164 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.575164 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575223 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575229 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575233 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575236 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575239 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575243 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575245 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575248 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575251 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575254 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:30.575252 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575256 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575259 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575262 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575265 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575267 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575270 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575273 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575276 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575279 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575281 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575284 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575287 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575289 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575292 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575295 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575297 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575300 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575302 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575305 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575307 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:30.575519 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575310 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575318 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575321 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575323 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575326 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575329 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575331 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575333 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575336 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575338 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575341 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575343 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575346 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575349 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575352 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575355 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575357 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575360 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575362 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575365 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:30.576034 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575368 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575370 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575373 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575375 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575378 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575381 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575383 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575386 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575389 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575391 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575394 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575396 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575399 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575401 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575409 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575412 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575415 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575417 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575420 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:30.576561 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575422 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575425 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575428 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575431 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575433 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575435 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575438 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575441 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575444 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575446 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575451 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575457 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575461 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575465 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575468 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575470 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:30.577053 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575473 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.575479 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575600 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575606 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575609 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575612 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575615 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575619 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575623 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575627 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575630 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575632 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575643 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575646 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575649 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:30.577451 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575652 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575655 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575658 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575660 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575663 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575666 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575668 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575671 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575674 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575677 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575680 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575683 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575685 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575688 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575690 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575693 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575695 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575698 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575700 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575704 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:30.577846 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575706 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575709 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575711 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575714 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575717 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575719 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575722 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575724 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575727 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575729 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575738 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575741 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575744 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575746 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575749 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575768 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575772 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575775 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575778 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575780 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:30.578327 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575783 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575786 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575788 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575791 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575793 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575796 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575798 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575801 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575804 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575806 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575809 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575811 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575814 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575818 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575821 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575824 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575826 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575829 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575831 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575834 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:30.578821 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575837 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575839 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575842 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575850 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575854 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575856 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575859 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575861 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575864 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575867 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575870 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575872 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:30.575875 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.575880 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:30.579315 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.576772 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:53:30.581175 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.581158 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:53:30.582157 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.582145 2574 server.go:1019] "Starting client certificate rotation"
Apr 16 19:53:30.582279 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.582257 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:30.582337 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.582309 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:30.609775 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.609729 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:30.617669 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.617642 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:30.635024 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.635000 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:53:30.642073 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.642053 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:30.642638 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.642623 2574 log.go:25] "Validated CRI v1 image API"
Apr 16 19:53:30.643899 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.643880 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:53:30.651910 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.651889 2574 fs.go:135] Filesystem UUIDs: map[05896e76-e6c2-4938-acb8-9f9e133b9af8:/dev/nvme0n1p3 3c3ae066-ea5e-4cb6-9bc7-dc02cd52b065:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 19:53:30.651981 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.651910 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:53:30.658235 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.658104 2574 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:30.655963584 +0000 UTC m=+0.460150452 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096201 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22cd3e5989e55df57c39a720e30475 SystemUUID:ec22cd3e-5989-e55d-f57c-39a720e30475 BootID:2c6b7d7f-8119-41df-b17d-d2e9e95e9adb Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fc:b5:7a:ca:f7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fc:b5:7a:ca:f7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7a:ef:4b:82:c9:92 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:53:30.658235 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.658230 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:53:30.658351 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.658315 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 19:53:30.659540 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.659516 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 19:53:30.659685 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.659543 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-239.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 19:53:30.659734 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.659695 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 19:53:30.659734 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.659703 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 19:53:30.659734 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.659717 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:30.659734 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.659732 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:30.660590 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.660579 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:30.660711 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.660702 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:53:30.663815 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.663802 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:53:30.663881 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.663819 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:53:30.663881 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.663833 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:53:30.663881 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.663842 2574 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:53:30.663881 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.663850 2574 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 19:53:30.665095 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.665083 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:30.665139 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.665102 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:30.669123 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.669086 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:53:30.671589 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.671549 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:53:30.675118 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675100 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675122 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675129 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675137 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675143 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675149 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675155 2574 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675168 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675176 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675182 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:53:30.675186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675191 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:53:30.675511 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675200 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:53:30.675511 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675221 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:53:30.675511 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.675226 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:53:30.676669 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.676645 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:53:30.676770 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.676664 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-239.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:53:30.676770 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.676649 2574 reflector.go:200] "Failed 
to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-239.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:53:30.678857 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.678845 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:53:30.678909 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.678886 2574 server.go:1295] "Started kubelet" Apr 16 19:53:30.679007 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.678963 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:53:30.679094 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.679033 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:53:30.679132 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.679096 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:53:30.679736 ip-10-0-137-239 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 19:53:30.681063 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.681042 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:53:30.682383 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.682366 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:53:30.684390 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.684373 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-frxbg" Apr 16 19:53:30.687359 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.687332 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:30.688058 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.688038 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:53:30.688784 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.688746 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:53:30.688863 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.688802 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:53:30.688863 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.688820 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:53:30.689000 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.688983 2574 factory.go:55] Registering systemd factory Apr 16 19:53:30.689067 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.689003 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:53:30.689067 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.689012 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:53:30.689067 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.689048 2574 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:53:30.689251 ip-10-0-137-239 
kubenswrapper[2574]: E0416 19:53:30.689229 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found" Apr 16 19:53:30.689368 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.689351 2574 factory.go:153] Registering CRI-O factory Apr 16 19:53:30.689435 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.689371 2574 factory.go:223] Registration of the crio container factory successfully Apr 16 19:53:30.689435 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.689421 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:53:30.689532 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.689444 2574 factory.go:103] Registering Raw factory Apr 16 19:53:30.689532 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.689482 2574 manager.go:1196] Started watching for new ooms in manager Apr 16 19:53:30.689931 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.689918 2574 manager.go:319] Starting recovery of all containers Apr 16 19:53:30.690513 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.690477 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:53:30.692815 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.692788 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-frxbg" Apr 16 19:53:30.695066 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.695041 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 19:53:30.695272 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.695242 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-239.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 19:53:30.695569 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.695531 2574 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 19:53:30.696125 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.695087 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-239.ec2.internal.18a6ee62b0744e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-239.ec2.internal,UID:ip-10-0-137-239.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-239.ec2.internal,},FirstTimestamp:2026-04-16 19:53:30.678857225 +0000 UTC m=+0.483044086,LastTimestamp:2026-04-16 19:53:30.678857225 +0000 UTC m=+0.483044086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-239.ec2.internal,}" Apr 16 19:53:30.701926 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.701736 2574 manager.go:324] Recovery completed Apr 16 19:53:30.704810 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.704780 2574 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 19:53:30.707873 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.707857 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:30.710486 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.710471 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:30.710547 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.710499 2574 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-137-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:30.710547 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.710528 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:30.711072 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.711058 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:53:30.711072 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.711070 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:53:30.711174 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.711086 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:30.713641 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.713629 2574 policy_none.go:49] "None policy: Start" Apr 16 19:53:30.713692 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.713645 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:53:30.713692 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.713655 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:53:30.761431 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.761413 2574 manager.go:341] "Starting Device Plugin manager" Apr 16 19:53:30.761594 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.761449 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:53:30.761594 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.761460 2574 server.go:85] "Starting device plugin registration server" Apr 16 19:53:30.761714 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.761702 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:53:30.761783 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.761716 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:53:30.761861 
ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.761845 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:53:30.761934 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.761923 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:53:30.761934 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.761933 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:53:30.762565 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.762546 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:53:30.762667 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.762580 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-239.ec2.internal\" not found" Apr 16 19:53:30.788822 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.788785 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:53:30.788822 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.788821 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:53:30.789033 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.788849 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 19:53:30.789033 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.788858 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:53:30.789033 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.788897 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:53:30.791487 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.791464 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:30.862634 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.862593 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:30.863617 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.863603 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:30.863684 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.863635 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:30.863684 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.863646 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:30.863684 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.863671 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-239.ec2.internal" Apr 16 19:53:30.872031 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.872010 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-239.ec2.internal" Apr 16 19:53:30.872089 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.872037 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-239.ec2.internal\": node \"ip-10-0-137-239.ec2.internal\" not found" Apr 16 
19:53:30.887698 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.887674 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found" Apr 16 19:53:30.889805 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.889779 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal"] Apr 16 19:53:30.889863 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.889853 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:30.891446 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.891430 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:30.891533 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.891464 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:30.891533 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.891480 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:30.893130 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.893116 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:30.893263 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.893248 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal" Apr 16 19:53:30.893310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.893291 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:30.893930 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.893903 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:30.893990 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.893910 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:30.893990 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.893963 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:30.893990 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.893976 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:30.893990 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.893939 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:30.894126 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.894002 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:30.895247 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.895233 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal" Apr 16 19:53:30.895288 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.895266 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:30.895976 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.895963 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:30.896054 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.895985 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:30.896054 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.895994 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:30.919384 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.919357 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-239.ec2.internal\" not found" node="ip-10-0-137-239.ec2.internal" Apr 16 19:53:30.923291 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.923275 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-239.ec2.internal\" not found" node="ip-10-0-137-239.ec2.internal" Apr 16 19:53:30.988304 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:30.988275 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found" Apr 16 19:53:30.990499 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.990476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ceda151090eda3f2f692c4ff94e6d53-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal\" (UID: \"7ceda151090eda3f2f692c4ff94e6d53\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal" Apr 16 19:53:30.990556 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.990518 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ceda151090eda3f2f692c4ff94e6d53-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal\" (UID: \"7ceda151090eda3f2f692c4ff94e6d53\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal" Apr 16 19:53:30.990556 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:30.990536 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f370b553c7c2da44374976f1160c1b70-config\") pod \"kube-apiserver-proxy-ip-10-0-137-239.ec2.internal\" (UID: \"f370b553c7c2da44374976f1160c1b70\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal" Apr 16 19:53:31.089362 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.089319 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found" Apr 16 19:53:31.091530 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.091506 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ceda151090eda3f2f692c4ff94e6d53-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal\" (UID: \"7ceda151090eda3f2f692c4ff94e6d53\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal" Apr 16 19:53:31.091588 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.091540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/7ceda151090eda3f2f692c4ff94e6d53-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal\" (UID: \"7ceda151090eda3f2f692c4ff94e6d53\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal"
Apr 16 19:53:31.091588 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.091557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f370b553c7c2da44374976f1160c1b70-config\") pod \"kube-apiserver-proxy-ip-10-0-137-239.ec2.internal\" (UID: \"f370b553c7c2da44374976f1160c1b70\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal"
Apr 16 19:53:31.091658 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.091589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ceda151090eda3f2f692c4ff94e6d53-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal\" (UID: \"7ceda151090eda3f2f692c4ff94e6d53\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal"
Apr 16 19:53:31.091658 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.091590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ceda151090eda3f2f692c4ff94e6d53-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal\" (UID: \"7ceda151090eda3f2f692c4ff94e6d53\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal"
Apr 16 19:53:31.091658 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.091614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f370b553c7c2da44374976f1160c1b70-config\") pod \"kube-apiserver-proxy-ip-10-0-137-239.ec2.internal\" (UID: \"f370b553c7c2da44374976f1160c1b70\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal"
Apr 16 19:53:31.189945 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.189907 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found"
Apr 16 19:53:31.221189 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.221168 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal"
Apr 16 19:53:31.225918 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.225897 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal"
Apr 16 19:53:31.290798 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.290732 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found"
Apr 16 19:53:31.391174 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.391130 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found"
Apr 16 19:53:31.491798 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.491704 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found"
Apr 16 19:53:31.583197 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.583161 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 19:53:31.583701 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.583328 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:53:31.592468 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.592441 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found"
Apr 16 19:53:31.687723 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.687688 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 19:53:31.693122 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.693098 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found"
Apr 16 19:53:31.695259 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.695236 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:30 +0000 UTC" deadline="2027-09-26 05:05:29.253365667 +0000 UTC"
Apr 16 19:53:31.695323 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.695260 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12657h11m57.558109338s"
Apr 16 19:53:31.697609 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.697591 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:31.716767 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.716735 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wbdjt"
Apr 16 19:53:31.725231 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.725212 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wbdjt"
Apr 16 19:53:31.794117 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.794052 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found"
Apr 16 19:53:31.839662 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:31.839622 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf370b553c7c2da44374976f1160c1b70.slice/crio-8845778f4b2f3488e6dc88854da963162bd3f4ed0d2125d9d3b8e0206953976f WatchSource:0}: Error finding container 8845778f4b2f3488e6dc88854da963162bd3f4ed0d2125d9d3b8e0206953976f: Status 404 returned error can't find the container with id 8845778f4b2f3488e6dc88854da963162bd3f4ed0d2125d9d3b8e0206953976f
Apr 16 19:53:31.841139 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:31.841111 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ceda151090eda3f2f692c4ff94e6d53.slice/crio-65df850a355ef69fdab19afa465c33aeea59c1f0e34c9a6a1ec254d9caf06983 WatchSource:0}: Error finding container 65df850a355ef69fdab19afa465c33aeea59c1f0e34c9a6a1ec254d9caf06983: Status 404 returned error can't find the container with id 65df850a355ef69fdab19afa465c33aeea59c1f0e34c9a6a1ec254d9caf06983
Apr 16 19:53:31.844265 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.844243 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:53:31.894610 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.894559 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found"
Apr 16 19:53:31.941040 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:31.941012 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:31.995654 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:31.995613 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-239.ec2.internal\" not found"
Apr 16 19:53:32.053895 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.053818 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:32.089307 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.089280 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal"
Apr 16 19:53:32.103062 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.103037 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:32.104000 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.103984 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal"
Apr 16 19:53:32.110012 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.109996 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:32.297389 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.297357 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:32.458939 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.458908 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:32.665974 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.665804 2574 apiserver.go:52] "Watching apiserver"
Apr 16 19:53:32.674576 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.674420 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 19:53:32.675650 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.675608 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-56m94","openshift-ovn-kubernetes/ovnkube-node-726jz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6","openshift-cluster-node-tuning-operator/tuned-92xs8","openshift-multus/multus-x4r79","openshift-network-operator/iptables-alerter-2xlpf","kube-system/konnectivity-agent-qsmh9","kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal","openshift-image-registry/node-ca-l8zj8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal","openshift-multus/multus-additional-cni-plugins-5srtm","openshift-multus/network-metrics-daemon-bdzq7"]
Apr 16 19:53:32.679088 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.678658 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6"
Apr 16 19:53:32.680703 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.680684 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 19:53:32.680703 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.680701 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sh69q\""
Apr 16 19:53:32.681050 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.681032 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 19:53:32.681256 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.681240 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 19:53:32.682551 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.682530 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2xlpf"
Apr 16 19:53:32.684412 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.684392 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 19:53:32.684619 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.684596 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7"
Apr 16 19:53:32.684694 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:32.684668 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e"
Apr 16 19:53:32.684852 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.684836 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sr4qz\""
Apr 16 19:53:32.685091 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.685074 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:32.685257 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.685243 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:32.686900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.686827 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l8zj8"
Apr 16 19:53:32.688570 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.688550 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 19:53:32.688801 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.688776 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4dx9c\""
Apr 16 19:53:32.688911 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.688851 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 19:53:32.689072 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.689056 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 19:53:32.689263 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.689244 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.690990 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.690916 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 19:53:32.691157 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.691110 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 19:53:32.691157 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.691124 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 19:53:32.691306 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.691274 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 19:53:32.691640 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.691623 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94"
Apr 16 19:53:32.691729 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:32.691685 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c"
Apr 16 19:53:32.691998 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.691978 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vkgx9\""
Apr 16 19:53:32.692200 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.692182 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 19:53:32.693028 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.693007 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 19:53:32.693965 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.693929 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.694042 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.694024 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.695775 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.695554 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:32.695775 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.695596 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gjj4h\""
Apr 16 19:53:32.695775 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.695637 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:53:32.695966 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.695855 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:53:32.695966 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.695878 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:53:32.695966 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.695915 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:53:32.696204 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.696164 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:32.696683 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.696312 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zj89f\""
Apr 16 19:53:32.698601 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.698551 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qsmh9"
Apr 16 19:53:32.698601 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.698585 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.700415 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700394 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:53:32.700552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700432 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:53:32.700552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-run-ovn\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.700552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a96e2c0-7394-4112-aea4-555bbe913368-host-slash\") pod \"iptables-alerter-2xlpf\" (UID: \"4a96e2c0-7394-4112-aea4-555bbe913368\") " pod="openshift-network-operator/iptables-alerter-2xlpf"
Apr 16 19:53:32.700552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700508 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 19:53:32.700552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700525 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s657l\" (UniqueName: \"kubernetes.io/projected/4a96e2c0-7394-4112-aea4-555bbe913368-kube-api-access-s657l\") pod \"iptables-alerter-2xlpf\" (UID: \"4a96e2c0-7394-4112-aea4-555bbe913368\") " pod="openshift-network-operator/iptables-alerter-2xlpf"
Apr 16 19:53:32.700552 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700548 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-75vmn\""
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rflqz\" (UniqueName: \"kubernetes.io/projected/884798a9-cac3-41a4-af20-f3c01d50646e-kube-api-access-rflqz\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7"
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700617 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6"
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700645 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03880fb-0202-4f2e-9f09-4064525141cd-host\") pod \"node-ca-l8zj8\" (UID: \"a03880fb-0202-4f2e-9f09-4064525141cd\") " pod="openshift-image-registry/node-ca-l8zj8"
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b78a4e77-873d-4057-97dd-587515df3295-ovnkube-script-lib\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700695 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-kubernetes\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700712 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-run-openvswitch\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700786 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2zqd\" (UniqueName: \"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94"
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700810 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-kubelet\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-systemd-units\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700860 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bcbns\""
Apr 16 19:53:32.700900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700862 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-node-log\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b78a4e77-873d-4057-97dd-587515df3295-ovnkube-config\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700937 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x525\" (UniqueName: \"kubernetes.io/projected/a03880fb-0202-4f2e-9f09-4064525141cd-kube-api-access-9x525\") pod \"node-ca-l8zj8\" (UID: \"a03880fb-0202-4f2e-9f09-4064525141cd\") " pod="openshift-image-registry/node-ca-l8zj8"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.700986 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b78a4e77-873d-4057-97dd-587515df3295-ovn-node-metrics-cert\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701013 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-systemd\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-var-lib-kubelet\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-host\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701131 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-registration-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-var-lib-openvswitch\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701179 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-cni-netd\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701217 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-sys\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701262 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-tmp\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701293 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-run-netns\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-etc-openvswitch\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-socket-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6"
Apr 16 19:53:32.701448 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-device-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701411 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-slash\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-run-ovn-kubernetes\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701453 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-cni-bin\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-run-systemd\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701496 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701528 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a03880fb-0202-4f2e-9f09-4064525141cd-serviceca\") pod \"node-ca-l8zj8\" (UID: \"a03880fb-0202-4f2e-9f09-4064525141cd\") " pod="openshift-image-registry/node-ca-l8zj8"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-log-socket\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b78a4e77-873d-4057-97dd-587515df3295-env-overrides\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-sysconfig\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701626 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-sysctl-d\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-sysctl-conf\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-sys-fs\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kfrb\" (UniqueName: \"kubernetes.io/projected/8514ce9f-7048-45ba-a71b-0b89134eb13c-kube-api-access-8kfrb\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701792 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgbjb\" (UniqueName: \"kubernetes.io/projected/b78a4e77-873d-4057-97dd-587515df3295-kube-api-access-cgbjb\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:32.702216 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701816 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-lib-modules\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.702860 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701830 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rsdm\" (UniqueName: \"kubernetes.io/projected/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-kube-api-access-5rsdm\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:32.702860 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4a96e2c0-7394-4112-aea4-555bbe913368-iptables-alerter-script\") pod \"iptables-alerter-2xlpf\" (UID: \"4a96e2c0-7394-4112-aea4-555bbe913368\") " pod="openshift-network-operator/iptables-alerter-2xlpf"
Apr 16 19:53:32.702860 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701862 2574 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-modprobe-d\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.702860 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701883 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-run\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.702860 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.701911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-tuned\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.725778 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.725718 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:31 +0000 UTC" deadline="2028-02-01 14:38:54.596045202 +0000 UTC" Apr 16 19:53:32.725778 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.725747 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15738h45m21.870301855s" Apr 16 19:53:32.790470 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.790437 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:53:32.794020 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.793969 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal" event={"ID":"7ceda151090eda3f2f692c4ff94e6d53","Type":"ContainerStarted","Data":"65df850a355ef69fdab19afa465c33aeea59c1f0e34c9a6a1ec254d9caf06983"} Apr 16 19:53:32.795269 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.795241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal" event={"ID":"f370b553c7c2da44374976f1160c1b70","Type":"ContainerStarted","Data":"8845778f4b2f3488e6dc88854da963162bd3f4ed0d2125d9d3b8e0206953976f"} Apr 16 19:53:32.803034 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803000 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-kubernetes\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.803034 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-sys\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.803219 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803063 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-run-openvswitch\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.803219 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803091 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zqd\" (UniqueName: 
\"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:32.803219 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803115 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-kubelet\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.803219 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803136 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-systemd-units\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.803219 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b78a4e77-873d-4057-97dd-587515df3295-ovn-node-metrics-cert\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.803219 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-systemd\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.803219 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-var-lib-kubelet\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x525\" (UniqueName: \"kubernetes.io/projected/a03880fb-0202-4f2e-9f09-4064525141cd-kube-api-access-9x525\") pod \"node-ca-l8zj8\" (UID: \"a03880fb-0202-4f2e-9f09-4064525141cd\") " pod="openshift-image-registry/node-ca-l8zj8" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-system-cni-dir\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803304 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cf3e916e-8fa0-480a-b696-be499c883f60-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803330 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-hostroot\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 
19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803388 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-cni-netd\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-tmp\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803438 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-os-release\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-run-multus-certs\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 
16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-tuned\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803535 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-socket-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.803571 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-device-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b78a4e77-873d-4057-97dd-587515df3295-env-overrides\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803601 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-var-lib-kubelet\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " 
pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803621 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-sysconfig\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:32.803644 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803671 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-sys\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803700 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-cni-netd\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-systemd-units\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:32.803732 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs podName:884798a9-cac3-41a4-af20-f3c01d50646e nodeName:}" failed. No retries permitted until 2026-04-16 19:53:33.303698331 +0000 UTC m=+3.107885181 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs") pod "network-metrics-daemon-bdzq7" (UID: "884798a9-cac3-41a4-af20-f3c01d50646e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-run-openvswitch\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-sysctl-d\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-kubelet\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-systemd\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-device-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.803999 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-kubernetes\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-sysctl-conf\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm" Apr 16 19:53:32.804115 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804097 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-socket-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rsdm\" (UniqueName: \"kubernetes.io/projected/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-kube-api-access-5rsdm\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804191 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf3e916e-8fa0-480a-b696-be499c883f60-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-sysconfig\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804226 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804270 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-sysctl-conf\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-cni-dir\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804366 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-sysctl-d\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kfrb\" (UniqueName: \"kubernetes.io/projected/8514ce9f-7048-45ba-a71b-0b89134eb13c-kube-api-access-8kfrb\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804411 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-run-netns\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-modprobe-d\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804618 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-cnibin\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b78a4e77-873d-4057-97dd-587515df3295-env-overrides\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-system-cni-dir\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6dae1add-3800-41a5-8840-7ad27c67bbec-cni-binary-copy\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804725 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-daemon-config\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/679903aa-1484-4696-9629-25872bd6f204-konnectivity-ca\") pod \"konnectivity-agent-qsmh9\" (UID: \"679903aa-1484-4696-9629-25872bd6f204\") " pod="kube-system/konnectivity-agent-qsmh9" Apr 16 19:53:32.804956 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804788 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-modprobe-d\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-run-ovn\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804845 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a96e2c0-7394-4112-aea4-555bbe913368-host-slash\") pod \"iptables-alerter-2xlpf\" (UID: \"4a96e2c0-7394-4112-aea4-555bbe913368\") " pod="openshift-network-operator/iptables-alerter-2xlpf" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rflqz\" (UniqueName: \"kubernetes.io/projected/884798a9-cac3-41a4-af20-f3c01d50646e-kube-api-access-rflqz\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03880fb-0202-4f2e-9f09-4064525141cd-host\") pod \"node-ca-l8zj8\" (UID: \"a03880fb-0202-4f2e-9f09-4064525141cd\") " pod="openshift-image-registry/node-ca-l8zj8" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804959 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-run-ovn\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.804967 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b78a4e77-873d-4057-97dd-587515df3295-ovnkube-script-lib\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a96e2c0-7394-4112-aea4-555bbe913368-host-slash\") pod \"iptables-alerter-2xlpf\" (UID: \"4a96e2c0-7394-4112-aea4-555bbe913368\") " pod="openshift-network-operator/iptables-alerter-2xlpf" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-conf-dir\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805172 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03880fb-0202-4f2e-9f09-4064525141cd-host\") pod \"node-ca-l8zj8\" (UID: \"a03880fb-0202-4f2e-9f09-4064525141cd\") " pod="openshift-image-registry/node-ca-l8zj8" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-node-log\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b78a4e77-873d-4057-97dd-587515df3295-ovnkube-config\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-host\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805342 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-node-log\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805375 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-socket-dir-parent\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-run-k8s-cni-cncf-io\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805440 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-registration-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.805721 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-host\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805500 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-var-lib-openvswitch\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-run\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805564 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-run-netns\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-run-netns\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-var-lib-openvswitch\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-etc-openvswitch\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b78a4e77-873d-4057-97dd-587515df3295-ovnkube-script-lib\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805676 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-run\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-registration-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-slash\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-etc-openvswitch\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-run-ovn-kubernetes\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 
19:53:32.805799 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b78a4e77-873d-4057-97dd-587515df3295-ovnkube-config\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-slash\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-cni-bin\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.806457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-run-ovn-kubernetes\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-cnibin\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805926 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-lib-modules\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-run-systemd\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.805993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806018 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a03880fb-0202-4f2e-9f09-4064525141cd-serviceca\") pod \"node-ca-l8zj8\" (UID: \"a03880fb-0202-4f2e-9f09-4064525141cd\") " pod="openshift-image-registry/node-ca-l8zj8" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-log-socket\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806082 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-var-lib-cni-multus\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-lib-modules\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806110 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-var-lib-kubelet\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806176 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrq7l\" (UniqueName: \"kubernetes.io/projected/6dae1add-3800-41a5-8840-7ad27c67bbec-kube-api-access-vrq7l\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-sys-fs\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.807221 
ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-log-socket\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgbjb\" (UniqueName: \"kubernetes.io/projected/b78a4e77-873d-4057-97dd-587515df3295-kube-api-access-cgbjb\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806285 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806300 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-os-release\") pod \"multus-additional-cni-plugins-5srtm\" (UID: 
\"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm" Apr 16 19:53:32.807221 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806333 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-var-lib-cni-bin\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806379 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-etc-kubernetes\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806385 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-run-systemd\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806404 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4a96e2c0-7394-4112-aea4-555bbe913368-iptables-alerter-script\") pod \"iptables-alerter-2xlpf\" (UID: \"4a96e2c0-7394-4112-aea4-555bbe913368\") " pod="openshift-network-operator/iptables-alerter-2xlpf" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806424 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-cni-bin\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806444 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf3e916e-8fa0-480a-b696-be499c883f60-cni-binary-copy\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806468 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5frv\" (UniqueName: \"kubernetes.io/projected/cf3e916e-8fa0-480a-b696-be499c883f60-kube-api-access-r5frv\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a03880fb-0202-4f2e-9f09-4064525141cd-serviceca\") pod \"node-ca-l8zj8\" (UID: \"a03880fb-0202-4f2e-9f09-4064525141cd\") " pod="openshift-image-registry/node-ca-l8zj8" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806618 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b78a4e77-873d-4057-97dd-587515df3295-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: 
I0416 19:53:32.806619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8514ce9f-7048-45ba-a71b-0b89134eb13c-sys-fs\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/679903aa-1484-4696-9629-25872bd6f204-agent-certs\") pod \"konnectivity-agent-qsmh9\" (UID: \"679903aa-1484-4696-9629-25872bd6f204\") " pod="kube-system/konnectivity-agent-qsmh9" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.806704 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s657l\" (UniqueName: \"kubernetes.io/projected/4a96e2c0-7394-4112-aea4-555bbe913368-kube-api-access-s657l\") pod \"iptables-alerter-2xlpf\" (UID: \"4a96e2c0-7394-4112-aea4-555bbe913368\") " pod="openshift-network-operator/iptables-alerter-2xlpf" Apr 16 19:53:32.807973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.807090 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4a96e2c0-7394-4112-aea4-555bbe913368-iptables-alerter-script\") pod \"iptables-alerter-2xlpf\" (UID: \"4a96e2c0-7394-4112-aea4-555bbe913368\") " pod="openshift-network-operator/iptables-alerter-2xlpf" Apr 16 19:53:32.808402 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.808053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b78a4e77-873d-4057-97dd-587515df3295-ovn-node-metrics-cert\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.808402 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.808072 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-tmp\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.809407 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.809382 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-etc-tuned\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.811275 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:32.811176 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:32.811275 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:32.811203 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:32.811275 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:32.811218 2574 projected.go:194] Error preparing data for projected volume kube-api-access-v2zqd for pod openshift-network-diagnostics/network-check-target-56m94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:32.811543 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:32.811316 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd podName:fe26e572-de88-4d80-bc05-a05fc220448c nodeName:}" 
failed. No retries permitted until 2026-04-16 19:53:33.311297985 +0000 UTC m=+3.115484855 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-v2zqd" (UniqueName: "kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd") pod "network-check-target-56m94" (UID: "fe26e572-de88-4d80-bc05-a05fc220448c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:32.814585 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.814276 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x525\" (UniqueName: \"kubernetes.io/projected/a03880fb-0202-4f2e-9f09-4064525141cd-kube-api-access-9x525\") pod \"node-ca-l8zj8\" (UID: \"a03880fb-0202-4f2e-9f09-4064525141cd\") " pod="openshift-image-registry/node-ca-l8zj8" Apr 16 19:53:32.814585 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.814546 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kfrb\" (UniqueName: \"kubernetes.io/projected/8514ce9f-7048-45ba-a71b-0b89134eb13c-kube-api-access-8kfrb\") pod \"aws-ebs-csi-driver-node-v9sz6\" (UID: \"8514ce9f-7048-45ba-a71b-0b89134eb13c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" Apr 16 19:53:32.814585 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.814554 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rsdm\" (UniqueName: \"kubernetes.io/projected/4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4-kube-api-access-5rsdm\") pod \"tuned-92xs8\" (UID: \"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4\") " pod="openshift-cluster-node-tuning-operator/tuned-92xs8" Apr 16 19:53:32.814826 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.814684 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rflqz\" (UniqueName: 
\"kubernetes.io/projected/884798a9-cac3-41a4-af20-f3c01d50646e-kube-api-access-rflqz\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:32.816123 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.816066 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s657l\" (UniqueName: \"kubernetes.io/projected/4a96e2c0-7394-4112-aea4-555bbe913368-kube-api-access-s657l\") pod \"iptables-alerter-2xlpf\" (UID: \"4a96e2c0-7394-4112-aea4-555bbe913368\") " pod="openshift-network-operator/iptables-alerter-2xlpf" Apr 16 19:53:32.817158 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.817123 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgbjb\" (UniqueName: \"kubernetes.io/projected/b78a4e77-873d-4057-97dd-587515df3295-kube-api-access-cgbjb\") pod \"ovnkube-node-726jz\" (UID: \"b78a4e77-873d-4057-97dd-587515df3295\") " pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:32.907161 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-hostroot\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.907161 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-os-release\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.907380 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-run-multus-certs\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.907380 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-run-multus-certs\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.907380 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-hostroot\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.907380 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907303 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-os-release\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79" Apr 16 19:53:32.907380 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm" Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/cf3e916e-8fa0-480a-b696-be499c883f60-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907425 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-cni-dir\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907453 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-run-netns\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-cnibin\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-system-cni-dir\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6dae1add-3800-41a5-8840-7ad27c67bbec-cni-binary-copy\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-daemon-config\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/679903aa-1484-4696-9629-25872bd6f204-konnectivity-ca\") pod \"konnectivity-agent-qsmh9\" (UID: \"679903aa-1484-4696-9629-25872bd6f204\") " pod="kube-system/konnectivity-agent-qsmh9"
Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907571 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.907600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-conf-dir\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907620 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-socket-dir-parent\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-system-cni-dir\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-run-k8s-cni-cncf-io\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-cnibin\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-cni-dir\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-var-lib-cni-multus\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907704 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-run-netns\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-var-lib-kubelet\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-cnibin\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907728 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrq7l\" (UniqueName: \"kubernetes.io/projected/6dae1add-3800-41a5-8840-7ad27c67bbec-kube-api-access-vrq7l\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-os-release\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907813 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-var-lib-cni-bin\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907838 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-etc-kubernetes\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907864 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf3e916e-8fa0-480a-b696-be499c883f60-cni-binary-copy\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5frv\" (UniqueName: \"kubernetes.io/projected/cf3e916e-8fa0-480a-b696-be499c883f60-kube-api-access-r5frv\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/679903aa-1484-4696-9629-25872bd6f204-agent-certs\") pod \"konnectivity-agent-qsmh9\" (UID: \"679903aa-1484-4696-9629-25872bd6f204\") " pod="kube-system/konnectivity-agent-qsmh9"
Apr 16 19:53:32.907994 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-system-cni-dir\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cf3e916e-8fa0-480a-b696-be499c883f60-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.907995 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf3e916e-8fa0-480a-b696-be499c883f60-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-etc-kubernetes\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-os-release\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908102 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf3e916e-8fa0-480a-b696-be499c883f60-system-cni-dir\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908144 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-run-k8s-cni-cncf-io\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-var-lib-cni-bin\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908187 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-conf-dir\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908233 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-socket-dir-parent\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908271 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-var-lib-cni-multus\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908313 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-cnibin\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dae1add-3800-41a5-8840-7ad27c67bbec-host-var-lib-kubelet\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908366 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6dae1add-3800-41a5-8840-7ad27c67bbec-cni-binary-copy\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908386 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf3e916e-8fa0-480a-b696-be499c883f60-cni-binary-copy\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6dae1add-3800-41a5-8840-7ad27c67bbec-multus-daemon-config\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.908741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.908687 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/679903aa-1484-4696-9629-25872bd6f204-konnectivity-ca\") pod \"konnectivity-agent-qsmh9\" (UID: \"679903aa-1484-4696-9629-25872bd6f204\") " pod="kube-system/konnectivity-agent-qsmh9"
Apr 16 19:53:32.909425 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.909408 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cf3e916e-8fa0-480a-b696-be499c883f60-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.911064 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.911047 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/679903aa-1484-4696-9629-25872bd6f204-agent-certs\") pod \"konnectivity-agent-qsmh9\" (UID: \"679903aa-1484-4696-9629-25872bd6f204\") " pod="kube-system/konnectivity-agent-qsmh9"
Apr 16 19:53:32.916200 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.916179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5frv\" (UniqueName: \"kubernetes.io/projected/cf3e916e-8fa0-480a-b696-be499c883f60-kube-api-access-r5frv\") pod \"multus-additional-cni-plugins-5srtm\" (UID: \"cf3e916e-8fa0-480a-b696-be499c883f60\") " pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:32.916287 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.916206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrq7l\" (UniqueName: \"kubernetes.io/projected/6dae1add-3800-41a5-8840-7ad27c67bbec-kube-api-access-vrq7l\") pod \"multus-x4r79\" (UID: \"6dae1add-3800-41a5-8840-7ad27c67bbec\") " pod="openshift-multus/multus-x4r79"
Apr 16 19:53:32.992259 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:32.992166 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6"
Apr 16 19:53:33.002041 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.002016 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2xlpf"
Apr 16 19:53:33.010851 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.010824 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l8zj8"
Apr 16 19:53:33.023908 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.023878 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-726jz"
Apr 16 19:53:33.030555 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.030531 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-92xs8"
Apr 16 19:53:33.038324 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.038297 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x4r79"
Apr 16 19:53:33.045895 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.045874 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qsmh9"
Apr 16 19:53:33.051543 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.051525 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5srtm"
Apr 16 19:53:33.310323 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.310226 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7"
Apr 16 19:53:33.310485 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:33.310364 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:33.310485 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:33.310461 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs podName:884798a9-cac3-41a4-af20-f3c01d50646e nodeName:}" failed. No retries permitted until 2026-04-16 19:53:34.31043989 +0000 UTC m=+4.114626764 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs") pod "network-metrics-daemon-bdzq7" (UID: "884798a9-cac3-41a4-af20-f3c01d50646e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:33.411550 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.411508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zqd\" (UniqueName: \"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94"
Apr 16 19:53:33.411726 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:33.411684 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:33.411726 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:33.411706 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:33.411726 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:33.411717 2574 projected.go:194] Error preparing data for projected volume kube-api-access-v2zqd for pod openshift-network-diagnostics/network-check-target-56m94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:33.411879 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:33.411797 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd podName:fe26e572-de88-4d80-bc05-a05fc220448c nodeName:}" failed. No retries permitted until 2026-04-16 19:53:34.411777983 +0000 UTC m=+4.215964849 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2zqd" (UniqueName: "kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd") pod "network-check-target-56m94" (UID: "fe26e572-de88-4d80-bc05-a05fc220448c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:33.587588 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:33.587559 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad5f1fa_6312_4a33_837e_c75d2d7eb5c4.slice/crio-bd78ae71266e27a4ebfbc0517d0a86a658ce0e67a808a5be1592e057a23f1103 WatchSource:0}: Error finding container bd78ae71266e27a4ebfbc0517d0a86a658ce0e67a808a5be1592e057a23f1103: Status 404 returned error can't find the container with id bd78ae71266e27a4ebfbc0517d0a86a658ce0e67a808a5be1592e057a23f1103
Apr 16 19:53:33.588365 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:33.588341 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb78a4e77_873d_4057_97dd_587515df3295.slice/crio-dfcc5cdca95974924b797a3a2afe513d63704a626ae9793b2382e2a7234f22fb WatchSource:0}: Error finding container dfcc5cdca95974924b797a3a2afe513d63704a626ae9793b2382e2a7234f22fb: Status 404 returned error can't find the container with id dfcc5cdca95974924b797a3a2afe513d63704a626ae9793b2382e2a7234f22fb
Apr 16 19:53:33.590050 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:33.590028 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a96e2c0_7394_4112_aea4_555bbe913368.slice/crio-0ba8f64e118bd8ac884daf3c21a49314e448c07332007830734866430c1eee89 WatchSource:0}: Error finding container 0ba8f64e118bd8ac884daf3c21a49314e448c07332007830734866430c1eee89: Status 404 returned error can't find the container with id 0ba8f64e118bd8ac884daf3c21a49314e448c07332007830734866430c1eee89
Apr 16 19:53:33.594247 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:33.594224 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8514ce9f_7048_45ba_a71b_0b89134eb13c.slice/crio-9fad5e5676cf80f01f78078cf21399e5ad1fd6d90739d9a7ce7e7cfc392bfbe7 WatchSource:0}: Error finding container 9fad5e5676cf80f01f78078cf21399e5ad1fd6d90739d9a7ce7e7cfc392bfbe7: Status 404 returned error can't find the container with id 9fad5e5676cf80f01f78078cf21399e5ad1fd6d90739d9a7ce7e7cfc392bfbe7
Apr 16 19:53:33.594981 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:33.594960 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf3e916e_8fa0_480a_b696_be499c883f60.slice/crio-3269c0a7c3e14b97b703a0f9ac01b47c5627b4ed134446f2faa091006e5ef652 WatchSource:0}: Error finding container 3269c0a7c3e14b97b703a0f9ac01b47c5627b4ed134446f2faa091006e5ef652: Status 404 returned error can't find the container with id 3269c0a7c3e14b97b703a0f9ac01b47c5627b4ed134446f2faa091006e5ef652
Apr 16 19:53:33.596368 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:33.596330 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda03880fb_0202_4f2e_9f09_4064525141cd.slice/crio-a5a38756b09f75c297e93275248e5cda02578ae8290257f89f8737b984851004 WatchSource:0}: Error finding container a5a38756b09f75c297e93275248e5cda02578ae8290257f89f8737b984851004: Status 404 returned error can't find the container with id a5a38756b09f75c297e93275248e5cda02578ae8290257f89f8737b984851004
Apr 16 19:53:33.597856 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:33.597717 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod679903aa_1484_4696_9629_25872bd6f204.slice/crio-341ea7cae2c06bbb7c3875a3b274881aaa66c77db96254434418de915be768cb WatchSource:0}: Error finding container 341ea7cae2c06bbb7c3875a3b274881aaa66c77db96254434418de915be768cb: Status 404 returned error can't find the container with id 341ea7cae2c06bbb7c3875a3b274881aaa66c77db96254434418de915be768cb
Apr 16 19:53:33.598688 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:53:33.598667 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dae1add_3800_41a5_8840_7ad27c67bbec.slice/crio-97ca67a3d10bceafe546ef96be2e3e520f8a703aa7a9e6f66913a207679cf067 WatchSource:0}: Error finding container 97ca67a3d10bceafe546ef96be2e3e520f8a703aa7a9e6f66913a207679cf067: Status 404 returned error can't find the container with id 97ca67a3d10bceafe546ef96be2e3e520f8a703aa7a9e6f66913a207679cf067
Apr 16 19:53:33.726452 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.726401 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:31 +0000 UTC" deadline="2028-01-03 16:30:42.574679966 +0000 UTC"
Apr 16 19:53:33.726452 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.726445 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15044h37m8.848237675s"
Apr 16 19:53:33.798624 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.798570 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2xlpf" event={"ID":"4a96e2c0-7394-4112-aea4-555bbe913368","Type":"ContainerStarted","Data":"0ba8f64e118bd8ac884daf3c21a49314e448c07332007830734866430c1eee89"}
Apr 16 19:53:33.799510 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.799477 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-92xs8" event={"ID":"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4","Type":"ContainerStarted","Data":"bd78ae71266e27a4ebfbc0517d0a86a658ce0e67a808a5be1592e057a23f1103"}
Apr 16 19:53:33.801009 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.800977 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal" event={"ID":"f370b553c7c2da44374976f1160c1b70","Type":"ContainerStarted","Data":"423d14642e5ca006fc6c37d61002eb49109beb891f62d182464c88f827ab0c3b"}
Apr 16 19:53:33.801867 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.801850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" event={"ID":"8514ce9f-7048-45ba-a71b-0b89134eb13c","Type":"ContainerStarted","Data":"9fad5e5676cf80f01f78078cf21399e5ad1fd6d90739d9a7ce7e7cfc392bfbe7"}
Apr 16 19:53:33.802740 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.802715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerStarted","Data":"dfcc5cdca95974924b797a3a2afe513d63704a626ae9793b2382e2a7234f22fb"}
Apr 16 19:53:33.803899 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.803874 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x4r79" event={"ID":"6dae1add-3800-41a5-8840-7ad27c67bbec","Type":"ContainerStarted","Data":"97ca67a3d10bceafe546ef96be2e3e520f8a703aa7a9e6f66913a207679cf067"}
Apr 16 19:53:33.804747 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.804720 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qsmh9" event={"ID":"679903aa-1484-4696-9629-25872bd6f204","Type":"ContainerStarted","Data":"341ea7cae2c06bbb7c3875a3b274881aaa66c77db96254434418de915be768cb"}
Apr 16 19:53:33.806306 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.806283 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l8zj8" event={"ID":"a03880fb-0202-4f2e-9f09-4064525141cd","Type":"ContainerStarted","Data":"a5a38756b09f75c297e93275248e5cda02578ae8290257f89f8737b984851004"}
Apr 16 19:53:33.807099 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.807081 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5srtm" event={"ID":"cf3e916e-8fa0-480a-b696-be499c883f60","Type":"ContainerStarted","Data":"3269c0a7c3e14b97b703a0f9ac01b47c5627b4ed134446f2faa091006e5ef652"}
Apr 16 19:53:33.813730 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:33.813688 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-239.ec2.internal" podStartSLOduration=1.813677072 podStartE2EDuration="1.813677072s" podCreationTimestamp="2026-04-16 19:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:53:33.813221844 +0000 UTC m=+3.617408731" watchObservedRunningTime="2026-04-16 19:53:33.813677072 +0000 UTC m=+3.617864010"
Apr 16 19:53:34.319037 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:34.318418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7"
Apr 16 19:53:34.319037 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:34.318578 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:34.319037 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:34.318640 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs podName:884798a9-cac3-41a4-af20-f3c01d50646e nodeName:}" failed. No retries permitted until 2026-04-16 19:53:36.318621592 +0000 UTC m=+6.122808447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs") pod "network-metrics-daemon-bdzq7" (UID: "884798a9-cac3-41a4-af20-f3c01d50646e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:34.420358 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:34.419636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zqd\" (UniqueName: \"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94"
Apr 16 19:53:34.420358 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:34.419832 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:34.420358 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:34.419862 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:34.420358 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:34.419879 2574 projected.go:194] Error preparing data for projected volume kube-api-access-v2zqd for pod openshift-network-diagnostics/network-check-target-56m94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:34.420358 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:34.419938 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd podName:fe26e572-de88-4d80-bc05-a05fc220448c nodeName:}" failed. No retries permitted until 2026-04-16 19:53:36.419920371 +0000 UTC m=+6.224107244 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2zqd" (UniqueName: "kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd") pod "network-check-target-56m94" (UID: "fe26e572-de88-4d80-bc05-a05fc220448c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:34.790024 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:34.789943 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7"
Apr 16 19:53:34.790521 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:34.790491 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e"
Apr 16 19:53:34.796806 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:34.796780 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94"
Apr 16 19:53:34.796962 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:34.796922 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c"
Apr 16 19:53:34.815687 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:34.815554 2574 generic.go:358] "Generic (PLEG): container finished" podID="7ceda151090eda3f2f692c4ff94e6d53" containerID="6483be85ed8cd72fbad7485cc2a94ebdcc2d2db99b0bed988a388720718a8f2c" exitCode=0
Apr 16 19:53:34.816508 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:34.816482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal" event={"ID":"7ceda151090eda3f2f692c4ff94e6d53","Type":"ContainerDied","Data":"6483be85ed8cd72fbad7485cc2a94ebdcc2d2db99b0bed988a388720718a8f2c"}
Apr 16 19:53:35.820797 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:35.820502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal" event={"ID":"7ceda151090eda3f2f692c4ff94e6d53","Type":"ContainerStarted","Data":"465b154cbeff29930fac25abd6cf402dc6120a76ee2f0a9d0447f1cc523a9da1"}
Apr 16 19:53:36.335457 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:36.335418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7"
Apr 16 19:53:36.335679 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:36.335662 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:36.335778 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:36.335733 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs podName:884798a9-cac3-41a4-af20-f3c01d50646e nodeName:}" failed. No retries permitted until 2026-04-16 19:53:40.335714408 +0000 UTC m=+10.139901260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs") pod "network-metrics-daemon-bdzq7" (UID: "884798a9-cac3-41a4-af20-f3c01d50646e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:36.436378 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:36.436320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zqd\" (UniqueName: \"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94"
Apr 16 19:53:36.436570 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:36.436534 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:36.436570 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:36.436552 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:36.436570 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:36.436564 2574 projected.go:194] Error preparing data for projected volume kube-api-access-v2zqd for pod openshift-network-diagnostics/network-check-target-56m94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:36.436731 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:36.436631 2574
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd podName:fe26e572-de88-4d80-bc05-a05fc220448c nodeName:}" failed. No retries permitted until 2026-04-16 19:53:40.436610333 +0000 UTC m=+10.240797204 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2zqd" (UniqueName: "kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd") pod "network-check-target-56m94" (UID: "fe26e572-de88-4d80-bc05-a05fc220448c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:36.789766 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:36.789668 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:36.789924 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:36.789682 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:36.789924 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:36.789848 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:36.790042 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:36.790010 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:38.789562 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:38.789527 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:38.789562 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:38.789539 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:38.790088 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:38.789667 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:38.790088 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:38.789805 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:40.369949 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:40.369893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:40.370385 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:40.370040 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:40.370385 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:40.370108 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs podName:884798a9-cac3-41a4-af20-f3c01d50646e nodeName:}" failed. No retries permitted until 2026-04-16 19:53:48.370085905 +0000 UTC m=+18.174272758 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs") pod "network-metrics-daemon-bdzq7" (UID: "884798a9-cac3-41a4-af20-f3c01d50646e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:40.471230 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:40.471191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zqd\" (UniqueName: \"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:40.471413 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:40.471374 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:40.471413 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:40.471400 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:40.471413 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:40.471415 2574 projected.go:194] Error preparing data for projected volume kube-api-access-v2zqd for pod openshift-network-diagnostics/network-check-target-56m94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:40.471595 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:40.471482 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd podName:fe26e572-de88-4d80-bc05-a05fc220448c nodeName:}" failed. 
No retries permitted until 2026-04-16 19:53:48.471463854 +0000 UTC m=+18.275650707 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2zqd" (UniqueName: "kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd") pod "network-check-target-56m94" (UID: "fe26e572-de88-4d80-bc05-a05fc220448c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:40.789818 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:40.789722 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:40.789981 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:40.789813 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:40.789981 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:40.789880 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:40.789981 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:40.789945 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:41.949634 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:41.949542 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-239.ec2.internal" podStartSLOduration=9.949520253 podStartE2EDuration="9.949520253s" podCreationTimestamp="2026-04-16 19:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:53:35.834252202 +0000 UTC m=+5.638439072" watchObservedRunningTime="2026-04-16 19:53:41.949520253 +0000 UTC m=+11.753707128" Apr 16 19:53:41.950197 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:41.949781 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hk2ln"] Apr 16 19:53:41.958067 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:41.958042 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:41.958218 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:41.958146 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:42.084444 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.084418 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dfa5879b-9279-4460-929b-8800e9ce40bf-kubelet-config\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:42.084544 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.084457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dfa5879b-9279-4460-929b-8800e9ce40bf-dbus\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:42.084544 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.084482 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:42.185603 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.185556 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dfa5879b-9279-4460-929b-8800e9ce40bf-dbus\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:42.185730 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.185623 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:42.185730 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.185724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dfa5879b-9279-4460-929b-8800e9ce40bf-kubelet-config\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:42.185861 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.185747 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dfa5879b-9279-4460-929b-8800e9ce40bf-dbus\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:42.185861 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:42.185845 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:42.185861 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.185855 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dfa5879b-9279-4460-929b-8800e9ce40bf-kubelet-config\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:42.185992 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:42.185907 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret podName:dfa5879b-9279-4460-929b-8800e9ce40bf nodeName:}" failed. 
No retries permitted until 2026-04-16 19:53:42.685888612 +0000 UTC m=+12.490075487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret") pod "global-pull-secret-syncer-hk2ln" (UID: "dfa5879b-9279-4460-929b-8800e9ce40bf") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:42.688907 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.688867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:42.689092 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:42.688994 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:42.689092 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:42.689072 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret podName:dfa5879b-9279-4460-929b-8800e9ce40bf nodeName:}" failed. No retries permitted until 2026-04-16 19:53:43.689052813 +0000 UTC m=+13.493239672 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret") pod "global-pull-secret-syncer-hk2ln" (UID: "dfa5879b-9279-4460-929b-8800e9ce40bf") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:42.790099 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.790064 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:42.790275 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.790064 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:42.790335 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:42.790270 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:42.790456 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:42.790433 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:42.834601 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.834566 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l8zj8" event={"ID":"a03880fb-0202-4f2e-9f09-4064525141cd","Type":"ContainerStarted","Data":"bdb25fbd8a020ed4ab8780a095944983c1fb28678a7fe1991747c7cc69aa8b34"} Apr 16 19:53:42.846148 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:42.846088 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l8zj8" podStartSLOduration=4.380387359 podStartE2EDuration="12.846073818s" podCreationTimestamp="2026-04-16 19:53:30 +0000 UTC" firstStartedPulling="2026-04-16 19:53:33.598315419 +0000 UTC m=+3.402502268" lastFinishedPulling="2026-04-16 19:53:42.064001866 +0000 UTC m=+11.868188727" observedRunningTime="2026-04-16 19:53:42.846004855 +0000 UTC m=+12.650191727" watchObservedRunningTime="2026-04-16 19:53:42.846073818 +0000 UTC m=+12.650260689" Apr 16 19:53:43.697424 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:43.697387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:43.697869 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:43.697543 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:43.697869 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:43.697619 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret podName:dfa5879b-9279-4460-929b-8800e9ce40bf 
nodeName:}" failed. No retries permitted until 2026-04-16 19:53:45.697597792 +0000 UTC m=+15.501784640 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret") pod "global-pull-secret-syncer-hk2ln" (UID: "dfa5879b-9279-4460-929b-8800e9ce40bf") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:43.789912 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:43.789877 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:43.790077 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:43.790011 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:44.789719 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:44.789684 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:44.790162 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:44.789827 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:44.790162 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:44.789880 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:44.790162 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:44.789962 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:45.710309 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:45.710271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:45.710488 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:45.710417 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:45.710488 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:45.710480 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret podName:dfa5879b-9279-4460-929b-8800e9ce40bf nodeName:}" failed. No retries permitted until 2026-04-16 19:53:49.710461821 +0000 UTC m=+19.514648691 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret") pod "global-pull-secret-syncer-hk2ln" (UID: "dfa5879b-9279-4460-929b-8800e9ce40bf") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:45.789901 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:45.789876 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:45.790268 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:45.790009 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:45.841882 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:45.841838 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" event={"ID":"8514ce9f-7048-45ba-a71b-0b89134eb13c","Type":"ContainerStarted","Data":"03d33eb66b02ce6c5ff99033bfaf0870c2e2012de3c77160b7c6e6a502cc7fb6"} Apr 16 19:53:45.843241 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:45.843210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qsmh9" event={"ID":"679903aa-1484-4696-9629-25872bd6f204","Type":"ContainerStarted","Data":"52afeeda9e05b1815dc49380f874225e2462c4485f9451ee577ba39c901dd589"} Apr 16 19:53:45.997781 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:45.997671 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qsmh9" Apr 16 19:53:45.998337 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:45.998319 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qsmh9" Apr 16 19:53:46.010400 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:46.010340 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qsmh9" podStartSLOduration=6.537007726 podStartE2EDuration="15.010320853s" podCreationTimestamp="2026-04-16 19:53:31 +0000 UTC" firstStartedPulling="2026-04-16 19:53:33.599932114 +0000 UTC m=+3.404118964" lastFinishedPulling="2026-04-16 19:53:42.073245243 +0000 UTC m=+11.877432091" observedRunningTime="2026-04-16 19:53:45.855438483 +0000 UTC m=+15.659625375" watchObservedRunningTime="2026-04-16 19:53:46.010320853 +0000 UTC m=+15.814507718" Apr 16 19:53:46.789369 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:46.789330 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:46.789655 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:46.789470 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:46.789655 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:46.789522 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:46.789655 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:46.789607 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:47.789091 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:47.789054 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:47.789526 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:47.789170 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:47.846461 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:47.846433 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:53:48.428036 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:48.427993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:48.428208 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:48.428159 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:48.428285 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:48.428241 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs podName:884798a9-cac3-41a4-af20-f3c01d50646e nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:04.428219129 +0000 UTC m=+34.232406002 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs") pod "network-metrics-daemon-bdzq7" (UID: "884798a9-cac3-41a4-af20-f3c01d50646e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:48.528540 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:48.528507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zqd\" (UniqueName: \"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:48.528693 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:48.528666 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:48.528693 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:48.528686 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:48.528783 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:48.528696 2574 projected.go:194] Error preparing data for projected volume kube-api-access-v2zqd for pod openshift-network-diagnostics/network-check-target-56m94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:48.528783 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:48.528750 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd 
podName:fe26e572-de88-4d80-bc05-a05fc220448c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.528734532 +0000 UTC m=+34.332921381 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2zqd" (UniqueName: "kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd") pod "network-check-target-56m94" (UID: "fe26e572-de88-4d80-bc05-a05fc220448c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:48.790061 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:48.789984 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:48.790061 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:48.790034 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:48.790460 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:48.790118 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:48.790460 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:48.790237 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:49.736977 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:49.736936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:49.737129 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:49.737078 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:49.737177 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:49.737140 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret podName:dfa5879b-9279-4460-929b-8800e9ce40bf nodeName:}" failed. No retries permitted until 2026-04-16 19:53:57.737124076 +0000 UTC m=+27.541310940 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret") pod "global-pull-secret-syncer-hk2ln" (UID: "dfa5879b-9279-4460-929b-8800e9ce40bf") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:49.789365 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:49.789324 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:49.789553 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:49.789480 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:50.790643 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.790486 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:50.791201 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.790564 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:50.791201 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:50.790729 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:50.791201 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:50.790837 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:50.853299 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.853275 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 19:53:50.853554 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.853528 2574 generic.go:358] "Generic (PLEG): container finished" podID="b78a4e77-873d-4057-97dd-587515df3295" containerID="faf97665ad0a798ceec41752eab4d6f8b62d9af55f473f036641ffc9c00dae26" exitCode=1 Apr 16 19:53:50.853637 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.853557 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerStarted","Data":"3488ad5a54228bce42fad19c1bb440f7fd402a97f5ded5581b35640ff5b84899"} Apr 16 19:53:50.853637 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.853587 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerStarted","Data":"5da94d0cd76e86c0c908f7632b905a05278a021aae376796ab7f8c7c58440546"} Apr 16 19:53:50.853637 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.853596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerDied","Data":"faf97665ad0a798ceec41752eab4d6f8b62d9af55f473f036641ffc9c00dae26"} Apr 16 19:53:50.853637 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.853606 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerStarted","Data":"be085b87df5d03d4731fcc9de9cb7943398ebba917ed900c7a5f8b90795e650f"} Apr 16 
19:53:50.854774 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.854740 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x4r79" event={"ID":"6dae1add-3800-41a5-8840-7ad27c67bbec","Type":"ContainerStarted","Data":"50b1f3e61ee45eefec23ffc047820882606664fa7e095bb2ad6c32a59e1a7315"} Apr 16 19:53:50.856038 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.856021 2574 generic.go:358] "Generic (PLEG): container finished" podID="cf3e916e-8fa0-480a-b696-be499c883f60" containerID="5320bc7bcc1536560e35d2c56c1c188aadb9e87c9df67a94b5154e12825a6950" exitCode=0 Apr 16 19:53:50.856117 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.856076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5srtm" event={"ID":"cf3e916e-8fa0-480a-b696-be499c883f60","Type":"ContainerDied","Data":"5320bc7bcc1536560e35d2c56c1c188aadb9e87c9df67a94b5154e12825a6950"} Apr 16 19:53:50.857298 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.857282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-92xs8" event={"ID":"4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4","Type":"ContainerStarted","Data":"a0fec711ba02dbe7d066a7468d37e2cccd7539bdd61713e0bbf5e8cb98716a62"} Apr 16 19:53:50.867570 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.867535 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x4r79" podStartSLOduration=3.383034537 podStartE2EDuration="19.867525129s" podCreationTimestamp="2026-04-16 19:53:31 +0000 UTC" firstStartedPulling="2026-04-16 19:53:33.600375693 +0000 UTC m=+3.404562555" lastFinishedPulling="2026-04-16 19:53:50.084866295 +0000 UTC m=+19.889053147" observedRunningTime="2026-04-16 19:53:50.866726185 +0000 UTC m=+20.670913054" watchObservedRunningTime="2026-04-16 19:53:50.867525129 +0000 UTC m=+20.671712009" Apr 16 19:53:50.890917 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:50.890837 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-92xs8" podStartSLOduration=3.401612782 podStartE2EDuration="19.890825462s" podCreationTimestamp="2026-04-16 19:53:31 +0000 UTC" firstStartedPulling="2026-04-16 19:53:33.589589669 +0000 UTC m=+3.393776530" lastFinishedPulling="2026-04-16 19:53:50.078802349 +0000 UTC m=+19.882989210" observedRunningTime="2026-04-16 19:53:50.890439231 +0000 UTC m=+20.694626101" watchObservedRunningTime="2026-04-16 19:53:50.890825462 +0000 UTC m=+20.695012332" Apr 16 19:53:51.049426 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.049406 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:53:51.773644 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.773513 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:53:51.049422429Z","UUID":"11b5c1b3-e9f0-4cf1-a14f-4babf2092f80","Handler":null,"Name":"","Endpoint":""} Apr 16 19:53:51.776708 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.776684 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:53:51.776708 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.776711 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:53:51.789018 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.788990 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:51.789174 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:51.789085 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:51.861548 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.861495 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2xlpf" event={"ID":"4a96e2c0-7394-4112-aea4-555bbe913368","Type":"ContainerStarted","Data":"b761a8d8ec105e186e3d65dcaa0b2d2cd285de75bfba3d13bce6f26479fd0306"} Apr 16 19:53:51.863709 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.863682 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" event={"ID":"8514ce9f-7048-45ba-a71b-0b89134eb13c","Type":"ContainerStarted","Data":"bbd035ef28215ae37c053f2f2376f734215d69509f73cdb77a013efea21cc19e"} Apr 16 19:53:51.867358 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.867339 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 19:53:51.867771 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.867727 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerStarted","Data":"8a4373e4e2ba6f575adf9ee7f68055ba8bae62fe817d85319bc6dc503e1ddc28"} Apr 16 19:53:51.867848 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.867779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerStarted","Data":"636cd0706963e731a37b42fc6f95fb113265022e88427e3e1dfb1d2e56453029"} Apr 16 19:53:51.872768 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:51.872708 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2xlpf" podStartSLOduration=13.391958577 podStartE2EDuration="21.872694364s" podCreationTimestamp="2026-04-16 19:53:30 +0000 UTC" firstStartedPulling="2026-04-16 19:53:33.592687551 +0000 UTC m=+3.396874413" lastFinishedPulling="2026-04-16 19:53:42.073423335 +0000 UTC m=+11.877610200" observedRunningTime="2026-04-16 19:53:51.872504052 +0000 UTC m=+21.676690923" watchObservedRunningTime="2026-04-16 19:53:51.872694364 +0000 UTC m=+21.676881237" Apr 16 19:53:52.789415 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:52.789382 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:52.789599 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:52.789426 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:52.789599 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:52.789521 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:52.789715 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:52.789676 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:52.871865 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:52.871825 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" event={"ID":"8514ce9f-7048-45ba-a71b-0b89134eb13c","Type":"ContainerStarted","Data":"68b98cf1b4ee5349cc83826b2c57ee01709278e3c2d08e18f0f21ec743ff9606"} Apr 16 19:53:52.901549 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:52.901495 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9sz6" podStartSLOduration=4.407271606 podStartE2EDuration="22.901479847s" podCreationTimestamp="2026-04-16 19:53:30 +0000 UTC" firstStartedPulling="2026-04-16 19:53:33.596054353 +0000 UTC m=+3.400241206" lastFinishedPulling="2026-04-16 19:53:52.090262585 +0000 UTC m=+21.894449447" observedRunningTime="2026-04-16 19:53:52.90102668 +0000 UTC m=+22.705213552" watchObservedRunningTime="2026-04-16 19:53:52.901479847 +0000 UTC m=+22.705666716" Apr 16 19:53:53.790178 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:53.790006 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:53.790377 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:53.790255 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:53.876805 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:53.876750 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 19:53:53.877266 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:53.877190 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerStarted","Data":"604cd1f4d14ce0986e89be3e50cc8ec9ae711c440fc1f4bfb4b2328e8f3574ae"} Apr 16 19:53:54.789971 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:54.789935 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:54.789971 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:54.789965 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:54.790233 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:54.790101 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:54.790233 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:54.790215 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:55.789310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:55.789282 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:55.789774 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:55.789378 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:56.789902 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:56.789662 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:56.790454 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:56.789671 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:56.790454 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:56.789928 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:56.790454 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:56.790041 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:56.885229 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:56.885201 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 19:53:56.885533 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:56.885506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerStarted","Data":"c27c8477640c5c765cc034c123b71bf9b46d62ef2c9f4809b3548e3bc45ed174"} Apr 16 19:53:56.885899 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:56.885879 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:56.886021 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:56.886006 2574 scope.go:117] "RemoveContainer" 
containerID="faf97665ad0a798ceec41752eab4d6f8b62d9af55f473f036641ffc9c00dae26" Apr 16 19:53:56.887233 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:56.887209 2574 generic.go:358] "Generic (PLEG): container finished" podID="cf3e916e-8fa0-480a-b696-be499c883f60" containerID="1fe6c49eafb7039e6ad34c469d7cfe06be6824fff371bd68cadf4e714930788f" exitCode=0 Apr 16 19:53:56.887340 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:56.887246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5srtm" event={"ID":"cf3e916e-8fa0-480a-b696-be499c883f60","Type":"ContainerDied","Data":"1fe6c49eafb7039e6ad34c469d7cfe06be6824fff371bd68cadf4e714930788f"} Apr 16 19:53:56.902066 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:56.902044 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:57.789551 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.789515 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:57.789709 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:57.789629 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:57.802631 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.802601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:57.803008 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:57.802704 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:57.803008 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:57.802769 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret podName:dfa5879b-9279-4460-929b-8800e9ce40bf nodeName:}" failed. No retries permitted until 2026-04-16 19:54:13.802741184 +0000 UTC m=+43.606928032 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret") pod "global-pull-secret-syncer-hk2ln" (UID: "dfa5879b-9279-4460-929b-8800e9ce40bf") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:57.872515 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.872478 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-56m94"] Apr 16 19:53:57.872664 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.872601 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:57.872706 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:57.872685 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:57.875843 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.875819 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hk2ln"] Apr 16 19:53:57.878598 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.878568 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bdzq7"] Apr 16 19:53:57.878694 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.878680 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:57.878805 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:57.878787 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:57.893159 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.893137 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 19:53:57.893505 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.893485 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:57.893623 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.893496 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" event={"ID":"b78a4e77-873d-4057-97dd-587515df3295","Type":"ContainerStarted","Data":"34fdb7b73a198fb8db8859be7515220fd5c0ce74c4a889661637bcc911cee2d7"} Apr 16 19:53:57.893623 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:57.893594 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:53:57.893797 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.893783 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:57.893879 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.893808 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:57.909794 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.909772 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:53:57.943913 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:57.943853 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" podStartSLOduration=10.392311471 podStartE2EDuration="26.943833245s" podCreationTimestamp="2026-04-16 19:53:31 +0000 UTC" firstStartedPulling="2026-04-16 19:53:33.591014633 +0000 UTC m=+3.395201486" lastFinishedPulling="2026-04-16 19:53:50.142536394 +0000 UTC m=+19.946723260" observedRunningTime="2026-04-16 19:53:57.942369975 +0000 UTC m=+27.746556862" watchObservedRunningTime="2026-04-16 19:53:57.943833245 +0000 UTC m=+27.748020115" Apr 16 19:53:58.078643 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:58.078613 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qsmh9" Apr 16 19:53:58.078801 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:58.078788 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:53:58.079253 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:58.079232 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qsmh9" Apr 16 19:53:58.896884 ip-10-0-137-239 kubenswrapper[2574]: I0416 
19:53:58.896851 2574 generic.go:358] "Generic (PLEG): container finished" podID="cf3e916e-8fa0-480a-b696-be499c883f60" containerID="6bf46af16b6c73a6f1b44b8706aa9c4f1de1b0033bb5a5cbfb50fc2d343e722d" exitCode=0 Apr 16 19:53:58.897439 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:58.896932 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5srtm" event={"ID":"cf3e916e-8fa0-480a-b696-be499c883f60","Type":"ContainerDied","Data":"6bf46af16b6c73a6f1b44b8706aa9c4f1de1b0033bb5a5cbfb50fc2d343e722d"} Apr 16 19:53:59.789628 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:59.789386 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:53:59.789628 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:59.789381 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:53:59.789628 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:53:59.789381 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:53:59.789951 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:59.789697 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:53:59.789951 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:59.789810 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:53:59.789951 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:53:59.789905 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:54:00.901964 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:00.901795 2574 generic.go:358] "Generic (PLEG): container finished" podID="cf3e916e-8fa0-480a-b696-be499c883f60" containerID="c8e25b2ac1269b8e766dc06f7efaa7a4f848a509cb04f996a1d53ffeba3e9aea" exitCode=0 Apr 16 19:54:00.902393 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:00.901882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5srtm" event={"ID":"cf3e916e-8fa0-480a-b696-be499c883f60","Type":"ContainerDied","Data":"c8e25b2ac1269b8e766dc06f7efaa7a4f848a509cb04f996a1d53ffeba3e9aea"} Apr 16 19:54:01.789712 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:01.789676 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:54:01.789712 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:01.789703 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:54:01.789939 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:01.789787 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:54:01.789974 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:01.789796 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:54:01.792386 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:01.790295 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:54:01.792386 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:01.790795 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:54:03.789819 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:03.789787 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:54:03.790371 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:03.789899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:54:03.790371 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:03.789901 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hk2ln" podUID="dfa5879b-9279-4460-929b-8800e9ce40bf" Apr 16 19:54:03.790371 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:03.790028 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:54:03.790371 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:03.790074 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-56m94" podUID="fe26e572-de88-4d80-bc05-a05fc220448c" Apr 16 19:54:03.790371 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:03.790121 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdzq7" podUID="884798a9-cac3-41a4-af20-f3c01d50646e" Apr 16 19:54:03.979739 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:03.979666 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-239.ec2.internal" event="NodeReady" Apr 16 19:54:03.979907 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:03.979823 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:54:04.017732 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.017700 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr"] Apr 16 19:54:04.037738 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.037544 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6"] Apr 16 19:54:04.037738 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.037697 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:04.040584 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.040551 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 19:54:04.040844 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.040823 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-z8vjn\"" Apr 16 19:54:04.041160 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.041139 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.041828 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.041803 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.054375 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.054349 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ctpdp"] Apr 16 19:54:04.054516 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.054499 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6" Apr 16 19:54:04.056600 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.056579 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.056745 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.056664 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.057195 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.057173 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-6c9pj\"" Apr 16 19:54:04.073633 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.073603 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg"] Apr 16 19:54:04.073800 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.073784 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.075511 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.075489 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 19:54:04.075511 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.075494 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-xm5bm\"" Apr 16 19:54:04.075675 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.075546 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.075732 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.075708 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 19:54:04.076021 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.076001 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.081124 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.081096 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 19:54:04.096332 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.096309 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-747fd5949f-wpqct"] Apr 16 19:54:04.096453 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.096435 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" Apr 16 19:54:04.098410 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.098390 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.098525 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.098410 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 19:54:04.098525 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.098422 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 19:54:04.098525 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.098433 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-t7ppx\"" Apr 16 19:54:04.098525 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.098455 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.120302 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.120274 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-t672v"] Apr 16 19:54:04.120447 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.120430 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.122518 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.122437 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 19:54:04.122518 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.122473 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 19:54:04.123350 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.123331 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q97t2\"" Apr 16 19:54:04.123823 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.123807 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 19:54:04.130545 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.130523 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 19:54:04.151067 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.151042 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr"] Apr 16 19:54:04.151067 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.151070 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6"] Apr 16 19:54:04.151067 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.151080 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5"] Apr 16 19:54:04.151308 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.151219 2574 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.153363 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.153341 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.153656 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.153638 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 19:54:04.154157 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.154138 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.155795 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.155772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:04.155900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.155811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-trusted-ca\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.155900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.155852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-serving-cert\") pod \"console-operator-9d4b6777b-ctpdp\" 
(UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.155900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.155873 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 19:54:04.155900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.155879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpvlc\" (UniqueName: \"kubernetes.io/projected/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-kube-api-access-fpvlc\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.156096 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.155904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-config\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.156096 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.155874 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-v7txp\"" Apr 16 19:54:04.156096 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.155976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k67f2\" (UniqueName: \"kubernetes.io/projected/417cddde-9bfc-4522-af0d-b947b69c5362-kube-api-access-k67f2\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:04.156241 ip-10-0-137-239 kubenswrapper[2574]: I0416 
19:54:04.156159 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrc52\" (UniqueName: \"kubernetes.io/projected/4f8ec4e6-16f1-43d3-a059-d546a6492815-kube-api-access-nrc52\") pod \"volume-data-source-validator-7c6cbb6c87-kfft6\" (UID: \"4f8ec4e6-16f1-43d3-a059-d546a6492815\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6" Apr 16 19:54:04.160088 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.160066 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 19:54:04.166779 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.166745 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg"] Apr 16 19:54:04.166881 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.166794 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j"] Apr 16 19:54:04.166881 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.166845 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" Apr 16 19:54:04.168705 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.168686 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 19:54:04.168825 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.168776 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 19:54:04.168883 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.168870 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-pptx7\"" Apr 16 19:54:04.168932 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.168890 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.169228 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.169212 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.178564 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.178545 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm"] Apr 16 19:54:04.178706 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.178683 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:04.180520 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.180488 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 19:54:04.180708 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.180688 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.180829 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.180784 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-sdtlb\"" Apr 16 19:54:04.180829 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.180690 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.180913 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.180862 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 19:54:04.192601 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.192581 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-696c89d9db-bdmt2"] Apr 16 19:54:04.192737 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.192716 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm" Apr 16 19:54:04.194750 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.194726 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 19:54:04.194897 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.194877 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.195165 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.195141 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.195244 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.195184 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 19:54:04.195487 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.195465 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-wdnng\"" Apr 16 19:54:04.206002 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.205982 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"] Apr 16 19:54:04.206099 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.206086 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:04.207743 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.207727 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 19:54:04.207960 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.207945 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 19:54:04.208052 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.207962 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 19:54:04.208119 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.208060 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 19:54:04.208119 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.208081 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.208119 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.208087 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.208263 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.208161 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-fg4fh\"" Apr 16 19:54:04.216886 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.216866 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9fsdq"] Apr 16 19:54:04.217040 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.217022 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59" Apr 16 19:54:04.221616 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.219128 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 19:54:04.229409 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.229391 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"] Apr 16 19:54:04.229568 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.229551 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:04.231405 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.231348 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.231484 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.231423 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.231521 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.231505 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7n2jf\"" Apr 16 19:54:04.231562 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.231522 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:54:04.241059 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241040 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ctpdp"] Apr 16 19:54:04.241150 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241065 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-747fd5949f-wpqct"] Apr 16 19:54:04.241150 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241079 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-t672v"] Apr 16 19:54:04.241150 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241091 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5"] Apr 16 19:54:04.241150 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241101 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm"] Apr 16 19:54:04.241150 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241112 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"] Apr 16 19:54:04.241150 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241123 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-696c89d9db-bdmt2"] Apr 16 19:54:04.241150 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241133 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j"] Apr 16 19:54:04.241150 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241149 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"] Apr 16 19:54:04.241481 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241150 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" Apr 16 19:54:04.241481 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241162 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9fsdq"] Apr 16 19:54:04.241647 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.241539 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-q7wjg"] Apr 16 19:54:04.243835 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.243815 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 19:54:04.243928 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.243838 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 19:54:04.243928 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.243887 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 19:54:04.243928 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.243901 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 19:54:04.253874 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.253853 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q7wjg"] Apr 16 19:54:04.253980 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.253952 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:04.256369 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256346 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.256369 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256365 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8flcv\"" Apr 16 19:54:04.256512 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256373 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:54:04.256512 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256380 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:54:04.256512 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256505 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.256697 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256585 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631ca511-9b5c-4215-97a0-cd40a1f88ddd-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nkcw5\" (UID: \"631ca511-9b5c-4215-97a0-cd40a1f88ddd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" Apr 16 19:54:04.256697 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256620 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbxww\" (UniqueName: \"kubernetes.io/projected/76fbc99b-bd4e-4ed8-9580-4a1845bf152f-kube-api-access-hbxww\") pod \"service-ca-operator-d6fc45fc5-p9xvg\" (UID: \"76fbc99b-bd4e-4ed8-9580-4a1845bf152f\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" Apr 16 19:54:04.256697 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wppdd\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-kube-api-access-wppdd\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.256866 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256741 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrc52\" (UniqueName: \"kubernetes.io/projected/4f8ec4e6-16f1-43d3-a059-d546a6492815-kube-api-access-nrc52\") pod \"volume-data-source-validator-7c6cbb6c87-kfft6\" (UID: \"4f8ec4e6-16f1-43d3-a059-d546a6492815\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6" Apr 16 19:54:04.256866 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkqkx\" (UniqueName: \"kubernetes.io/projected/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-kube-api-access-pkqkx\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.256866 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76fbc99b-bd4e-4ed8-9580-4a1845bf152f-config\") pod \"service-ca-operator-d6fc45fc5-p9xvg\" (UID: \"76fbc99b-bd4e-4ed8-9580-4a1845bf152f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" Apr 16 19:54:04.256866 ip-10-0-137-239 kubenswrapper[2574]: I0416 
19:54:04.256849 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-bound-sa-token\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.257051 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:04.257051 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256932 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-trusted-ca\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.257051 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.256979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76fbc99b-bd4e-4ed8-9580-4a1845bf152f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-p9xvg\" (UID: \"76fbc99b-bd4e-4ed8-9580-4a1845bf152f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" Apr 16 19:54:04.257214 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.257065 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:04.257214 ip-10-0-137-239 
kubenswrapper[2574]: I0416 19:54:04.257073 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631ca511-9b5c-4215-97a0-cd40a1f88ddd-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nkcw5\" (UID: \"631ca511-9b5c-4215-97a0-cd40a1f88ddd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" Apr 16 19:54:04.257214 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-serving-cert\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.257214 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpvlc\" (UniqueName: \"kubernetes.io/projected/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-kube-api-access-fpvlc\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.257214 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.257148 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls podName:417cddde-9bfc-4522-af0d-b947b69c5362 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.75712857 +0000 UTC m=+34.561315423 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gthr" (UID: "417cddde-9bfc-4522-af0d-b947b69c5362") : secret "samples-operator-tls" not found Apr 16 19:54:04.257523 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-service-ca-bundle\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.257523 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257340 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-config\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.257523 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257371 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-image-registry-private-configuration\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.257523 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-certificates\") pod 
\"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.257523 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257450 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-snapshots\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.257523 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257474 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-serving-cert\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.257523 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k67f2\" (UniqueName: \"kubernetes.io/projected/417cddde-9bfc-4522-af0d-b947b69c5362-kube-api-access-k67f2\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:04.257859 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.257859 ip-10-0-137-239 
kubenswrapper[2574]: I0416 19:54:04.257568 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjpdb\" (UniqueName: \"kubernetes.io/projected/631ca511-9b5c-4215-97a0-cd40a1f88ddd-kube-api-access-cjpdb\") pod \"kube-storage-version-migrator-operator-6769c5d45-nkcw5\" (UID: \"631ca511-9b5c-4215-97a0-cd40a1f88ddd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" Apr 16 19:54:04.257859 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.257859 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257622 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-trusted-ca\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.257859 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257646 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-installation-pull-secrets\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.257859 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257693 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-tmp\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.257859 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.257717 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/26e84dd9-5d40-41dd-95a3-95e087c04263-ca-trust-extracted\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.258372 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.258350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-config\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.258372 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.258363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-trusted-ca\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.261750 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.261728 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-serving-cert\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.267654 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.267631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrc52\" (UniqueName: \"kubernetes.io/projected/4f8ec4e6-16f1-43d3-a059-d546a6492815-kube-api-access-nrc52\") pod \"volume-data-source-validator-7c6cbb6c87-kfft6\" (UID: \"4f8ec4e6-16f1-43d3-a059-d546a6492815\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6" Apr 16 19:54:04.268225 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.268203 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpvlc\" (UniqueName: \"kubernetes.io/projected/efaa6ad2-6fd5-4047-90af-f4b40a394f8f-kube-api-access-fpvlc\") pod \"console-operator-9d4b6777b-ctpdp\" (UID: \"efaa6ad2-6fd5-4047-90af-f4b40a394f8f\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:04.268857 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.268837 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k67f2\" (UniqueName: \"kubernetes.io/projected/417cddde-9bfc-4522-af0d-b947b69c5362-kube-api-access-k67f2\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:04.358896 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.358859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-service-ca-bundle\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.358896 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.358898 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-image-registry-private-configuration\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.359136 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.358999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttmbt\" (UniqueName: \"kubernetes.io/projected/c466d975-16d2-4ae9-8d08-159d0c6f360e-kube-api-access-ttmbt\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:04.359136 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359026 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-default-certificate\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:04.359136 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359072 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/52f124ce-6f9c-4329-9b94-708c5c6715f6-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" Apr 16 19:54:04.359287 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359141 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c466d975-16d2-4ae9-8d08-159d0c6f360e-tmp-dir\") pod 
\"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:04.359287 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-snapshots\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.359287 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359262 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:04.359435 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.359435 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359320 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c466d975-16d2-4ae9-8d08-159d0c6f360e-config-volume\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:04.359538 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359473 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" 
(UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-hub\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" Apr 16 19:54:04.359538 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-service-ca-bundle\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.359642 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359527 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wppdd\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-kube-api-access-wppdd\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.359642 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:04.359642 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-stats-auth\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:04.359642 
ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359628 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76fbc99b-bd4e-4ed8-9580-4a1845bf152f-config\") pod \"service-ca-operator-d6fc45fc5-p9xvg\" (UID: \"76fbc99b-bd4e-4ed8-9580-4a1845bf152f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" Apr 16 19:54:04.359851 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-ca\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" Apr 16 19:54:04.359851 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76fbc99b-bd4e-4ed8-9580-4a1845bf152f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-p9xvg\" (UID: \"76fbc99b-bd4e-4ed8-9580-4a1845bf152f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" Apr 16 19:54:04.359851 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359738 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d23d4c10-6686-46c3-bcb7-85ee9826dba3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:04.359851 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st7r8\" (UniqueName: 
\"kubernetes.io/projected/879830d9-0b32-47cc-bd4a-5600123b9e43-kube-api-access-st7r8\") pod \"klusterlet-addon-workmgr-69689b7f44-mvt59\" (UID: \"879830d9-0b32-47cc-bd4a-5600123b9e43\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59" Apr 16 19:54:04.359851 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359814 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgsd7\" (UniqueName: \"kubernetes.io/projected/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-kube-api-access-sgsd7\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:04.359851 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-certificates\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.360117 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-snapshots\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.360117 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359873 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" Apr 16 19:54:04.360117 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.359913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjpdb\" (UniqueName: \"kubernetes.io/projected/631ca511-9b5c-4215-97a0-cd40a1f88ddd-kube-api-access-cjpdb\") pod \"kube-storage-version-migrator-operator-6769c5d45-nkcw5\" (UID: \"631ca511-9b5c-4215-97a0-cd40a1f88ddd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" Apr 16 19:54:04.360117 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.360039 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.360117 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.360090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-serving-cert\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v" Apr 16 19:54:04.360355 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.360201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.360355 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.360241 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-trusted-ca\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.360355 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.360299 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:04.360355 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.360317 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747fd5949f-wpqct: secret "image-registry-tls" not found
Apr 16 19:54:04.360551 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.360357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-installation-pull-secrets\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.360551 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.360390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stdpr\" (UniqueName: \"kubernetes.io/projected/0ea503dc-2cac-4e5c-82ad-73e2f2babdb3-kube-api-access-stdpr\") pod \"managed-serviceaccount-addon-agent-676775ffb5-dsbxm\" (UID: \"0ea503dc-2cac-4e5c-82ad-73e2f2babdb3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm"
Apr 16 19:54:04.360870 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.360844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76fbc99b-bd4e-4ed8-9580-4a1845bf152f-config\") pod \"service-ca-operator-d6fc45fc5-p9xvg\" (UID: \"76fbc99b-bd4e-4ed8-9580-4a1845bf152f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg"
Apr 16 19:54:04.360999 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.360903 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/879830d9-0b32-47cc-bd4a-5600123b9e43-tmp\") pod \"klusterlet-addon-workmgr-69689b7f44-mvt59\" (UID: \"879830d9-0b32-47cc-bd4a-5600123b9e43\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"
Apr 16 19:54:04.360999 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.360973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-tmp\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v"
Apr 16 19:54:04.361122 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/26e84dd9-5d40-41dd-95a3-95e087c04263-ca-trust-extracted\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.361122 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0ea503dc-2cac-4e5c-82ad-73e2f2babdb3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-676775ffb5-dsbxm\" (UID: \"0ea503dc-2cac-4e5c-82ad-73e2f2babdb3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm"
Apr 16 19:54:04.361236 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361215 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-trusted-ca\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.361293 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631ca511-9b5c-4215-97a0-cd40a1f88ddd-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nkcw5\" (UID: \"631ca511-9b5c-4215-97a0-cd40a1f88ddd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5"
Apr 16 19:54:04.361346 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361295 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/26e84dd9-5d40-41dd-95a3-95e087c04263-ca-trust-extracted\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.361346 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbxww\" (UniqueName: \"kubernetes.io/projected/76fbc99b-bd4e-4ed8-9580-4a1845bf152f-kube-api-access-hbxww\") pod \"service-ca-operator-d6fc45fc5-p9xvg\" (UID: \"76fbc99b-bd4e-4ed8-9580-4a1845bf152f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg"
Apr 16 19:54:04.361440 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.361440 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68hjf\" (UniqueName: \"kubernetes.io/projected/d23d4c10-6686-46c3-bcb7-85ee9826dba3-kube-api-access-68hjf\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j"
Apr 16 19:54:04.361440 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/879830d9-0b32-47cc-bd4a-5600123b9e43-klusterlet-config\") pod \"klusterlet-addon-workmgr-69689b7f44-mvt59\" (UID: \"879830d9-0b32-47cc-bd4a-5600123b9e43\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"
Apr 16 19:54:04.361591 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361504 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-tmp\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v"
Apr 16 19:54:04.361591 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.361570 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls podName:26e84dd9-5d40-41dd-95a3-95e087c04263 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.861552264 +0000 UTC m=+34.665739125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls") pod "image-registry-747fd5949f-wpqct" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263") : secret "image-registry-tls" not found
Apr 16 19:54:04.361930 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-bound-sa-token\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.362034 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkqkx\" (UniqueName: \"kubernetes.io/projected/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-kube-api-access-pkqkx\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v"
Apr 16 19:54:04.362034 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.361985 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2"
Apr 16 19:54:04.362034 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.362015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j"
Apr 16 19:54:04.362191 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.362047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75m7l\" (UniqueName: \"kubernetes.io/projected/52f124ce-6f9c-4329-9b94-708c5c6715f6-kube-api-access-75m7l\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.362191 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.362081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq"
Apr 16 19:54:04.362191 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.362106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hhkn\" (UniqueName: \"kubernetes.io/projected/175b0100-2c40-4ff1-993a-ed325cab1d64-kube-api-access-2hhkn\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq"
Apr 16 19:54:04.362191 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.362143 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631ca511-9b5c-4215-97a0-cd40a1f88ddd-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nkcw5\" (UID: \"631ca511-9b5c-4215-97a0-cd40a1f88ddd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5"
Apr 16 19:54:04.362381 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.362274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-image-registry-private-configuration\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.362751 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.362711 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631ca511-9b5c-4215-97a0-cd40a1f88ddd-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nkcw5\" (UID: \"631ca511-9b5c-4215-97a0-cd40a1f88ddd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5"
Apr 16 19:54:04.362751 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.362740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76fbc99b-bd4e-4ed8-9580-4a1845bf152f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-p9xvg\" (UID: \"76fbc99b-bd4e-4ed8-9580-4a1845bf152f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg"
Apr 16 19:54:04.362934 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.362848 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-installation-pull-secrets\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.363084 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.363059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-serving-cert\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v"
Apr 16 19:54:04.363834 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.363818 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6"
Apr 16 19:54:04.364047 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.364030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631ca511-9b5c-4215-97a0-cd40a1f88ddd-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nkcw5\" (UID: \"631ca511-9b5c-4215-97a0-cd40a1f88ddd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5"
Apr 16 19:54:04.367801 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.367731 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjpdb\" (UniqueName: \"kubernetes.io/projected/631ca511-9b5c-4215-97a0-cd40a1f88ddd-kube-api-access-cjpdb\") pod \"kube-storage-version-migrator-operator-6769c5d45-nkcw5\" (UID: \"631ca511-9b5c-4215-97a0-cd40a1f88ddd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5"
Apr 16 19:54:04.368906 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.368885 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-certificates\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.369547 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.369502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-bound-sa-token\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.369748 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.369729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbxww\" (UniqueName: \"kubernetes.io/projected/76fbc99b-bd4e-4ed8-9580-4a1845bf152f-kube-api-access-hbxww\") pod \"service-ca-operator-d6fc45fc5-p9xvg\" (UID: \"76fbc99b-bd4e-4ed8-9580-4a1845bf152f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg"
Apr 16 19:54:04.370176 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.370155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkqkx\" (UniqueName: \"kubernetes.io/projected/c5bd5e90-47ac-41b5-bc0c-07feb9989bab-kube-api-access-pkqkx\") pod \"insights-operator-585dfdc468-t672v\" (UID: \"c5bd5e90-47ac-41b5-bc0c-07feb9989bab\") " pod="openshift-insights/insights-operator-585dfdc468-t672v"
Apr 16 19:54:04.375075 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.375054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wppdd\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-kube-api-access-wppdd\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct"
Apr 16 19:54:04.383435 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.383380 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp"
Apr 16 19:54:04.414240 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.414207 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg"
Apr 16 19:54:04.464203 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464002 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-t672v"
Apr 16 19:54:04.464203 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464195 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stdpr\" (UniqueName: \"kubernetes.io/projected/0ea503dc-2cac-4e5c-82ad-73e2f2babdb3-kube-api-access-stdpr\") pod \"managed-serviceaccount-addon-agent-676775ffb5-dsbxm\" (UID: \"0ea503dc-2cac-4e5c-82ad-73e2f2babdb3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm"
Apr 16 19:54:04.464399 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/879830d9-0b32-47cc-bd4a-5600123b9e43-tmp\") pod \"klusterlet-addon-workmgr-69689b7f44-mvt59\" (UID: \"879830d9-0b32-47cc-bd4a-5600123b9e43\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"
Apr 16 19:54:04.464399 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464268 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0ea503dc-2cac-4e5c-82ad-73e2f2babdb3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-676775ffb5-dsbxm\" (UID: \"0ea503dc-2cac-4e5c-82ad-73e2f2babdb3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm"
Apr 16 19:54:04.464399 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464296 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.464399 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68hjf\" (UniqueName: \"kubernetes.io/projected/d23d4c10-6686-46c3-bcb7-85ee9826dba3-kube-api-access-68hjf\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j"
Apr 16 19:54:04.464399 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464356 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/879830d9-0b32-47cc-bd4a-5600123b9e43-klusterlet-config\") pod \"klusterlet-addon-workmgr-69689b7f44-mvt59\" (UID: \"879830d9-0b32-47cc-bd4a-5600123b9e43\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"
Apr 16 19:54:04.464399 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464442 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75m7l\" (UniqueName: \"kubernetes.io/projected/52f124ce-6f9c-4329-9b94-708c5c6715f6-kube-api-access-75m7l\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464498 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hhkn\" (UniqueName: \"kubernetes.io/projected/175b0100-2c40-4ff1-993a-ed325cab1d64-kube-api-access-2hhkn\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttmbt\" (UniqueName: \"kubernetes.io/projected/c466d975-16d2-4ae9-8d08-159d0c6f360e-kube-api-access-ttmbt\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-default-certificate\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/52f124ce-6f9c-4329-9b94-708c5c6715f6-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c466d975-16d2-4ae9-8d08-159d0c6f360e-tmp-dir\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c466d975-16d2-4ae9-8d08-159d0c6f360e-config-volume\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-hub\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-stats-auth\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464819 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-ca\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.467443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d23d4c10-6686-46c3-bcb7-85ee9826dba3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j"
Apr 16 19:54:04.468468 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-st7r8\" (UniqueName: \"kubernetes.io/projected/879830d9-0b32-47cc-bd4a-5600123b9e43-kube-api-access-st7r8\") pod \"klusterlet-addon-workmgr-69689b7f44-mvt59\" (UID: \"879830d9-0b32-47cc-bd4a-5600123b9e43\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"
Apr 16 19:54:04.468468 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.464933 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgsd7\" (UniqueName: \"kubernetes.io/projected/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-kube-api-access-sgsd7\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2"
Apr 16 19:54:04.468468 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.465126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.468468 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.465838 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c466d975-16d2-4ae9-8d08-159d0c6f360e-config-volume\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg"
Apr 16 19:54:04.468468 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.465944 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:04.468468 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.466009 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert podName:175b0100-2c40-4ff1-993a-ed325cab1d64 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.965991888 +0000 UTC m=+34.770178753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert") pod "ingress-canary-9fsdq" (UID: "175b0100-2c40-4ff1-993a-ed325cab1d64") : secret "canary-serving-cert" not found
Apr 16 19:54:04.468468 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.467380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/879830d9-0b32-47cc-bd4a-5600123b9e43-tmp\") pod \"klusterlet-addon-workmgr-69689b7f44-mvt59\" (UID: \"879830d9-0b32-47cc-bd4a-5600123b9e43\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"
Apr 16 19:54:04.469056 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.468847 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c466d975-16d2-4ae9-8d08-159d0c6f360e-tmp-dir\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg"
Apr 16 19:54:04.469196 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.469171 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0ea503dc-2cac-4e5c-82ad-73e2f2babdb3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-676775ffb5-dsbxm\" (UID: \"0ea503dc-2cac-4e5c-82ad-73e2f2babdb3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm"
Apr 16 19:54:04.469493 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.469438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/52f124ce-6f9c-4329-9b94-708c5c6715f6-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.469573 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.469563 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.969544111 +0000 UTC m=+34.773730962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : configmap references non-existent config key: service-ca.crt
Apr 16 19:54:04.469664 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.469641 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:04.469735 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.469699 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs podName:884798a9-cac3-41a4-af20-f3c01d50646e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.469686469 +0000 UTC m=+66.273873319 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs") pod "network-metrics-daemon-bdzq7" (UID: "884798a9-cac3-41a4-af20-f3c01d50646e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:04.469817 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.469735 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:54:04.469817 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.469812 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.969795685 +0000 UTC m=+34.773982539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : secret "router-metrics-certs-default" not found
Apr 16 19:54:04.470091 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.470072 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:04.470161 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.470122 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls podName:d23d4c10-6686-46c3-bcb7-85ee9826dba3 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.970108512 +0000 UTC m=+34.774295366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zlp6j" (UID: "d23d4c10-6686-46c3-bcb7-85ee9826dba3") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:04.470387 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.470368 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:04.470454 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.470419 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls podName:c466d975-16d2-4ae9-8d08-159d0c6f360e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.970404971 +0000 UTC m=+34.774591820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls") pod "dns-default-q7wjg" (UID: "c466d975-16d2-4ae9-8d08-159d0c6f360e") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:04.470454 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.469651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d23d4c10-6686-46c3-bcb7-85ee9826dba3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j"
Apr 16 19:54:04.470835 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.470730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.473651 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.473569 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-stats-auth\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2"
Apr 16 19:54:04.476249 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.475849 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5"
Apr 16 19:54:04.478035 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.477997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.478731 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.478705 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-ca\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.479067 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.479044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75m7l\" (UniqueName: \"kubernetes.io/projected/52f124ce-6f9c-4329-9b94-708c5c6715f6-kube-api-access-75m7l\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.480138 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.479965 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-default-certificate\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2"
Apr 16 19:54:04.484142 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.482791 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/52f124ce-6f9c-4329-9b94-708c5c6715f6-hub\") pod \"cluster-proxy-proxy-agent-6c79b8b4fd-dz825\" (UID: \"52f124ce-6f9c-4329-9b94-708c5c6715f6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"
Apr 16 19:54:04.484142 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.483346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/879830d9-0b32-47cc-bd4a-5600123b9e43-klusterlet-config\") pod \"klusterlet-addon-workmgr-69689b7f44-mvt59\" (UID: \"879830d9-0b32-47cc-bd4a-5600123b9e43\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"
Apr 16 19:54:04.492065 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.492038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-st7r8\" (UniqueName: \"kubernetes.io/projected/879830d9-0b32-47cc-bd4a-5600123b9e43-kube-api-access-st7r8\") pod \"klusterlet-addon-workmgr-69689b7f44-mvt59\" (UID: \"879830d9-0b32-47cc-bd4a-5600123b9e43\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"
Apr 16 19:54:04.512889 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.496839 2574
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4gptc"] Apr 16 19:54:04.512889 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.508924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stdpr\" (UniqueName: \"kubernetes.io/projected/0ea503dc-2cac-4e5c-82ad-73e2f2babdb3-kube-api-access-stdpr\") pod \"managed-serviceaccount-addon-agent-676775ffb5-dsbxm\" (UID: \"0ea503dc-2cac-4e5c-82ad-73e2f2babdb3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm" Apr 16 19:54:04.512889 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.508998 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68hjf\" (UniqueName: \"kubernetes.io/projected/d23d4c10-6686-46c3-bcb7-85ee9826dba3-kube-api-access-68hjf\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:04.512889 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.509627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hhkn\" (UniqueName: \"kubernetes.io/projected/175b0100-2c40-4ff1-993a-ed325cab1d64-kube-api-access-2hhkn\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:04.512889 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.511858 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm" Apr 16 19:54:04.512889 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.512678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttmbt\" (UniqueName: \"kubernetes.io/projected/c466d975-16d2-4ae9-8d08-159d0c6f360e-kube-api-access-ttmbt\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:04.512889 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.512831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgsd7\" (UniqueName: \"kubernetes.io/projected/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-kube-api-access-sgsd7\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:04.534533 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.533029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59" Apr 16 19:54:04.550224 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.548850 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.550224 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.549647 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" Apr 16 19:54:04.552090 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.551862 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-r5qgr\"" Apr 16 19:54:04.570973 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.567035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zqd\" (UniqueName: \"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:54:04.570973 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.567360 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:04.570973 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.567382 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:04.570973 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.567395 2574 projected.go:194] Error preparing data for projected volume kube-api-access-v2zqd for pod openshift-network-diagnostics/network-check-target-56m94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:04.570973 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.567451 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd podName:fe26e572-de88-4d80-bc05-a05fc220448c nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:36.567432215 +0000 UTC m=+66.371619086 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2zqd" (UniqueName: "kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd") pod "network-check-target-56m94" (UID: "fe26e572-de88-4d80-bc05-a05fc220448c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:04.640020 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.639981 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6"] Apr 16 19:54:04.655489 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:04.651115 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f8ec4e6_16f1_43d3_a059_d546a6492815.slice/crio-7f35534535422339ae7583ca37770c5daac1c07f01cd383f642a24f2238eef13 WatchSource:0}: Error finding container 7f35534535422339ae7583ca37770c5daac1c07f01cd383f642a24f2238eef13: Status 404 returned error can't find the container with id 7f35534535422339ae7583ca37770c5daac1c07f01cd383f642a24f2238eef13 Apr 16 19:54:04.655489 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.652884 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ctpdp"] Apr 16 19:54:04.661593 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:04.661555 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefaa6ad2_6fd5_4047_90af_f4b40a394f8f.slice/crio-68f2b7d2d6518841472b1a287f352cc0aec49aeae7089f4ac1afa83439eef8a1 WatchSource:0}: Error finding container 68f2b7d2d6518841472b1a287f352cc0aec49aeae7089f4ac1afa83439eef8a1: Status 404 returned error can't find the container with id 
68f2b7d2d6518841472b1a287f352cc0aec49aeae7089f4ac1afa83439eef8a1 Apr 16 19:54:04.668151 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.668008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kcw9\" (UniqueName: \"kubernetes.io/projected/67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca-kube-api-access-4kcw9\") pod \"node-resolver-4gptc\" (UID: \"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca\") " pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.668151 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.668115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca-hosts-file\") pod \"node-resolver-4gptc\" (UID: \"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca\") " pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.668151 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.668141 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca-tmp-dir\") pod \"node-resolver-4gptc\" (UID: \"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca\") " pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.681193 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.681147 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg"] Apr 16 19:54:04.716985 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.716924 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-t672v"] Apr 16 19:54:04.729136 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.729086 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5"] Apr 16 19:54:04.732670 ip-10-0-137-239 
kubenswrapper[2574]: I0416 19:54:04.732501 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm"] Apr 16 19:54:04.734338 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:04.734271 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod631ca511_9b5c_4215_97a0_cd40a1f88ddd.slice/crio-f9eaacb5ca5d50b3cb9e7c4cd54168093fc77aae9adb905ef5af1ae082dc3058 WatchSource:0}: Error finding container f9eaacb5ca5d50b3cb9e7c4cd54168093fc77aae9adb905ef5af1ae082dc3058: Status 404 returned error can't find the container with id f9eaacb5ca5d50b3cb9e7c4cd54168093fc77aae9adb905ef5af1ae082dc3058 Apr 16 19:54:04.739082 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:04.739047 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ea503dc_2cac_4e5c_82ad_73e2f2babdb3.slice/crio-b4fb32a4ff6f28863538ea42aaf5abdbaaa57ee30d5dfd6550a8cff9f2e6c44c WatchSource:0}: Error finding container b4fb32a4ff6f28863538ea42aaf5abdbaaa57ee30d5dfd6550a8cff9f2e6c44c: Status 404 returned error can't find the container with id b4fb32a4ff6f28863538ea42aaf5abdbaaa57ee30d5dfd6550a8cff9f2e6c44c Apr 16 19:54:04.764716 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.764665 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59"] Apr 16 19:54:04.767500 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:04.767461 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod879830d9_0b32_47cc_bd4a_5600123b9e43.slice/crio-0eb2df319be42457b24c5aa5ab4ab6dbb657eec7824468830215be481b26d822 WatchSource:0}: Error finding container 0eb2df319be42457b24c5aa5ab4ab6dbb657eec7824468830215be481b26d822: Status 404 returned error can't 
find the container with id 0eb2df319be42457b24c5aa5ab4ab6dbb657eec7824468830215be481b26d822 Apr 16 19:54:04.768820 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.768645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:04.768820 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.768730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kcw9\" (UniqueName: \"kubernetes.io/projected/67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca-kube-api-access-4kcw9\") pod \"node-resolver-4gptc\" (UID: \"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca\") " pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.768979 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.768880 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:04.768979 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.768937 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls podName:417cddde-9bfc-4522-af0d-b947b69c5362 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.768918133 +0000 UTC m=+35.573105007 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gthr" (UID: "417cddde-9bfc-4522-af0d-b947b69c5362") : secret "samples-operator-tls" not found Apr 16 19:54:04.769095 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.769083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca-hosts-file\") pod \"node-resolver-4gptc\" (UID: \"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca\") " pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.769209 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.769112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca-tmp-dir\") pod \"node-resolver-4gptc\" (UID: \"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca\") " pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.769264 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.769233 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca-hosts-file\") pod \"node-resolver-4gptc\" (UID: \"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca\") " pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.769517 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.769499 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca-tmp-dir\") pod \"node-resolver-4gptc\" (UID: \"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca\") " pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.779870 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.779849 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kcw9\" 
(UniqueName: \"kubernetes.io/projected/67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca-kube-api-access-4kcw9\") pod \"node-resolver-4gptc\" (UID: \"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca\") " pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.793887 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.793869 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825"] Apr 16 19:54:04.795854 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:04.795830 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f124ce_6f9c_4329_9b94_708c5c6715f6.slice/crio-44b0b928d0feff0bd26de12bae24018e85cfce742b64ec84db6b2f4c59e04743 WatchSource:0}: Error finding container 44b0b928d0feff0bd26de12bae24018e85cfce742b64ec84db6b2f4c59e04743: Status 404 returned error can't find the container with id 44b0b928d0feff0bd26de12bae24018e85cfce742b64ec84db6b2f4c59e04743 Apr 16 19:54:04.870279 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.870250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:04.870463 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.870422 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:04.870463 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.870445 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747fd5949f-wpqct: secret "image-registry-tls" not found Apr 16 19:54:04.870586 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.870513 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls podName:26e84dd9-5d40-41dd-95a3-95e087c04263 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.87049268 +0000 UTC m=+35.674679548 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls") pod "image-registry-747fd5949f-wpqct" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263") : secret "image-registry-tls" not found Apr 16 19:54:04.900822 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.900786 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4gptc" Apr 16 19:54:04.911106 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.911075 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" event={"ID":"efaa6ad2-6fd5-4047-90af-f4b40a394f8f","Type":"ContainerStarted","Data":"68f2b7d2d6518841472b1a287f352cc0aec49aeae7089f4ac1afa83439eef8a1"} Apr 16 19:54:04.912259 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.912228 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" event={"ID":"631ca511-9b5c-4215-97a0-cd40a1f88ddd","Type":"ContainerStarted","Data":"f9eaacb5ca5d50b3cb9e7c4cd54168093fc77aae9adb905ef5af1ae082dc3058"} Apr 16 19:54:04.913415 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.913389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-t672v" event={"ID":"c5bd5e90-47ac-41b5-bc0c-07feb9989bab","Type":"ContainerStarted","Data":"b95b284088a92b9ff35e1f5f716d541ea372ea71153ac7e7d09fec789ae8010a"} Apr 16 19:54:04.914395 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.914369 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" event={"ID":"52f124ce-6f9c-4329-9b94-708c5c6715f6","Type":"ContainerStarted","Data":"44b0b928d0feff0bd26de12bae24018e85cfce742b64ec84db6b2f4c59e04743"} Apr 16 19:54:04.915422 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.915395 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59" event={"ID":"879830d9-0b32-47cc-bd4a-5600123b9e43","Type":"ContainerStarted","Data":"0eb2df319be42457b24c5aa5ab4ab6dbb657eec7824468830215be481b26d822"} Apr 16 19:54:04.915660 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:04.915638 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f8454e_2f9f_4bb4_8cc1_afbff30ea2ca.slice/crio-591faf03a625fa66d4a3d443967d47ee59d3ea37400c2f361be5f2b44e4ef797 WatchSource:0}: Error finding container 591faf03a625fa66d4a3d443967d47ee59d3ea37400c2f361be5f2b44e4ef797: Status 404 returned error can't find the container with id 591faf03a625fa66d4a3d443967d47ee59d3ea37400c2f361be5f2b44e4ef797 Apr 16 19:54:04.916582 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.916560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm" event={"ID":"0ea503dc-2cac-4e5c-82ad-73e2f2babdb3","Type":"ContainerStarted","Data":"b4fb32a4ff6f28863538ea42aaf5abdbaaa57ee30d5dfd6550a8cff9f2e6c44c"} Apr 16 19:54:04.917650 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.917630 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" event={"ID":"76fbc99b-bd4e-4ed8-9580-4a1845bf152f","Type":"ContainerStarted","Data":"23bd664497c6eb1dccac9c377eeffe721cddb507d788b1f655530adbf90c267e"} Apr 16 19:54:04.918652 ip-10-0-137-239 kubenswrapper[2574]: 
I0416 19:54:04.918628 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6" event={"ID":"4f8ec4e6-16f1-43d3-a059-d546a6492815","Type":"ContainerStarted","Data":"7f35534535422339ae7583ca37770c5daac1c07f01cd383f642a24f2238eef13"} Apr 16 19:54:04.971443 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.971407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:04.971595 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.971554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:04.971662 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.971588 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:04.971662 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.971657 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:04.971784 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.971665 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls podName:c466d975-16d2-4ae9-8d08-159d0c6f360e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.971643104 +0000 UTC m=+35.775829969 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls") pod "dns-default-q7wjg" (UID: "c466d975-16d2-4ae9-8d08-159d0c6f360e") : secret "dns-default-metrics-tls" not found Apr 16 19:54:04.971784 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.971590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:04.971784 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.971705 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.971686024 +0000 UTC m=+35.775872887 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:04.971784 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.971779 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls podName:d23d4c10-6686-46c3-bcb7-85ee9826dba3 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.971737859 +0000 UTC m=+35.775924716 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zlp6j" (UID: "d23d4c10-6686-46c3-bcb7-85ee9826dba3") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:04.971993 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.971836 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:04.971993 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:04.971889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:04.971993 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.971954 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:04.972109 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.972009 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert podName:175b0100-2c40-4ff1-993a-ed325cab1d64 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.971996373 +0000 UTC m=+35.776183224 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert") pod "ingress-canary-9fsdq" (UID: "175b0100-2c40-4ff1-993a-ed325cab1d64") : secret "canary-serving-cert" not found Apr 16 19:54:04.972109 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.972013 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:04.972109 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:04.972062 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.972049279 +0000 UTC m=+35.776236127 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : secret "router-metrics-certs-default" not found Apr 16 19:54:05.780274 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.780236 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:05.780545 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.780413 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:05.780545 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.780492 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls podName:417cddde-9bfc-4522-af0d-b947b69c5362 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.78047269 +0000 UTC m=+37.584659553 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gthr" (UID: "417cddde-9bfc-4522-af0d-b947b69c5362") : secret "samples-operator-tls" not found Apr 16 19:54:05.791108 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.789516 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:54:05.791108 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.790053 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:54:05.791108 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.790443 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:54:05.793886 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.792031 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:54:05.793886 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.792250 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:54:05.793886 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.792566 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p7kgs\"" Apr 16 19:54:05.793886 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.792848 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:54:05.793886 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.793391 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2xst7\"" Apr 16 19:54:05.793886 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.793616 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:05.881602 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.881435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:05.882407 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.881961 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 16 19:54:05.882407 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.881988 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747fd5949f-wpqct: secret "image-registry-tls" not found Apr 16 19:54:05.882407 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.882062 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls podName:26e84dd9-5d40-41dd-95a3-95e087c04263 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.882040644 +0000 UTC m=+37.686227506 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls") pod "image-registry-747fd5949f-wpqct" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263") : secret "image-registry-tls" not found Apr 16 19:54:05.943163 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.943124 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4gptc" event={"ID":"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca","Type":"ContainerStarted","Data":"8abde89eac08e6c304decb3d70c9dbeb57952135b116c06196c225aaa6f3bfc6"} Apr 16 19:54:05.943341 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.943174 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4gptc" event={"ID":"67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca","Type":"ContainerStarted","Data":"591faf03a625fa66d4a3d443967d47ee59d3ea37400c2f361be5f2b44e4ef797"} Apr 16 19:54:05.960783 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.959922 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4gptc" podStartSLOduration=1.959899751 podStartE2EDuration="1.959899751s" podCreationTimestamp="2026-04-16 19:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:05.95965948 +0000 UTC m=+35.763846353" watchObservedRunningTime="2026-04-16 19:54:05.959899751 +0000 UTC m=+35.764086623" Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.982005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.982051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.982083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:05.982143 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: I0416 
19:54:05.982195 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.982323 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.982386 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls podName:c466d975-16d2-4ae9-8d08-159d0c6f360e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.98236785 +0000 UTC m=+37.786554720 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls") pod "dns-default-q7wjg" (UID: "c466d975-16d2-4ae9-8d08-159d0c6f360e") : secret "dns-default-metrics-tls" not found Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.983413 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.983396578 +0000 UTC m=+37.787583446 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.983492 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.983529 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls podName:d23d4c10-6686-46c3-bcb7-85ee9826dba3 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.983517401 +0000 UTC m=+37.787704251 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zlp6j" (UID: "d23d4c10-6686-46c3-bcb7-85ee9826dba3") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.983596 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.983627 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert podName:175b0100-2c40-4ff1-993a-ed325cab1d64 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.983615526 +0000 UTC m=+37.787802390 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert") pod "ingress-canary-9fsdq" (UID: "175b0100-2c40-4ff1-993a-ed325cab1d64") : secret "canary-serving-cert" not found Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.983699 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:05.983781 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:05.983728 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.983718632 +0000 UTC m=+37.787905493 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : secret "router-metrics-certs-default" not found Apr 16 19:54:06.526259 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:06.525073 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97"] Apr 16 19:54:06.540647 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:06.539746 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97"] Apr 16 19:54:06.540647 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:06.539930 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97" Apr 16 19:54:06.542360 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:06.542153 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-8trwv\"" Apr 16 19:54:06.692101 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:06.692063 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m44w\" (UniqueName: \"kubernetes.io/projected/7663f14f-d056-4025-928a-0110e843ff4c-kube-api-access-7m44w\") pod \"network-check-source-8894fc9bd-qqm97\" (UID: \"7663f14f-d056-4025-928a-0110e843ff4c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97" Apr 16 19:54:06.793741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:06.793129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m44w\" (UniqueName: \"kubernetes.io/projected/7663f14f-d056-4025-928a-0110e843ff4c-kube-api-access-7m44w\") pod \"network-check-source-8894fc9bd-qqm97\" (UID: \"7663f14f-d056-4025-928a-0110e843ff4c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97" Apr 16 19:54:06.820535 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:06.820469 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m44w\" (UniqueName: \"kubernetes.io/projected/7663f14f-d056-4025-928a-0110e843ff4c-kube-api-access-7m44w\") pod \"network-check-source-8894fc9bd-qqm97\" (UID: \"7663f14f-d056-4025-928a-0110e843ff4c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97" Apr 16 19:54:06.874204 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:06.873491 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97" Apr 16 19:54:07.804967 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:07.804192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:07.804967 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:07.804428 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:07.804967 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:07.804495 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls podName:417cddde-9bfc-4522-af0d-b947b69c5362 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:11.804475956 +0000 UTC m=+41.608662813 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gthr" (UID: "417cddde-9bfc-4522-af0d-b947b69c5362") : secret "samples-operator-tls" not found Apr 16 19:54:07.906200 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:07.905466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:07.906200 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:07.905741 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:07.906200 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:07.905774 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747fd5949f-wpqct: secret "image-registry-tls" not found Apr 16 19:54:07.906200 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:07.905837 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls podName:26e84dd9-5d40-41dd-95a3-95e087c04263 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:11.905815326 +0000 UTC m=+41.710002180 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls") pod "image-registry-747fd5949f-wpqct" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263") : secret "image-registry-tls" not found Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:08.006338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:08.006481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:08.006513 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:08.006544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:08.007564 ip-10-0-137-239 
kubenswrapper[2574]: I0416 19:54:08.006595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:08.006735 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:08.006814 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:12.006794198 +0000 UTC m=+41.810981049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : secret "router-metrics-certs-default" not found Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:08.007231 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:08.007274 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls podName:c466d975-16d2-4ae9-8d08-159d0c6f360e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:12.007260677 +0000 UTC m=+41.811447529 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls") pod "dns-default-q7wjg" (UID: "c466d975-16d2-4ae9-8d08-159d0c6f360e") : secret "dns-default-metrics-tls" not found Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:08.007349 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:12.007338655 +0000 UTC m=+41.811525508 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:08.007405 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:08.007439 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls podName:d23d4c10-6686-46c3-bcb7-85ee9826dba3 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:12.007428902 +0000 UTC m=+41.811615756 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zlp6j" (UID: "d23d4c10-6686-46c3-bcb7-85ee9826dba3") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:08.007499 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:08.007564 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:08.007527 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert podName:175b0100-2c40-4ff1-993a-ed325cab1d64 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:12.007517159 +0000 UTC m=+41.811704009 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert") pod "ingress-canary-9fsdq" (UID: "175b0100-2c40-4ff1-993a-ed325cab1d64") : secret "canary-serving-cert" not found Apr 16 19:54:11.444684 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.444644 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vrm46"] Apr 16 19:54:11.447633 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.447612 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:11.449499 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.449475 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 19:54:11.449609 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.449500 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 19:54:11.449609 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.449519 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fd9zf\"" Apr 16 19:54:11.455425 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.455349 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vrm46"] Apr 16 19:54:11.543305 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.543269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/57623683-701c-481e-a07f-ba6e226f7785-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:11.543499 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.543377 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:11.645063 ip-10-0-137-239 kubenswrapper[2574]: 
I0416 19:54:11.645033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/57623683-701c-481e-a07f-ba6e226f7785-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:11.645259 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.645084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:11.645259 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:11.645192 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:11.645259 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:11.645259 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert podName:57623683-701c-481e-a07f-ba6e226f7785 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:12.145241995 +0000 UTC m=+41.949428842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vrm46" (UID: "57623683-701c-481e-a07f-ba6e226f7785") : secret "networking-console-plugin-cert" not found Apr 16 19:54:11.645801 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.645778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/57623683-701c-481e-a07f-ba6e226f7785-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:11.846339 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.846297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:11.846535 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:11.846454 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:11.846535 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:11.846531 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls podName:417cddde-9bfc-4522-af0d-b947b69c5362 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:19.84651141 +0000 UTC m=+49.650698262 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gthr" (UID: "417cddde-9bfc-4522-af0d-b947b69c5362") : secret "samples-operator-tls" not found Apr 16 19:54:11.947908 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:11.947870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:11.948067 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:11.948042 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:11.948067 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:11.948066 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747fd5949f-wpqct: secret "image-registry-tls" not found Apr 16 19:54:11.948163 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:11.948132 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls podName:26e84dd9-5d40-41dd-95a3-95e087c04263 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:19.948112742 +0000 UTC m=+49.752299612 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls") pod "image-registry-747fd5949f-wpqct" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263") : secret "image-registry-tls" not found Apr 16 19:54:12.048852 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:12.048814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:12.049029 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:12.048860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:12.049029 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:12.048910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:12.049029 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:12.048961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 
19:54:12.049186 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:12.049036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:12.049186 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.049070 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:20.049052488 +0000 UTC m=+49.853239356 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:12.049186 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.049129 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:12.049186 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.049173 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls podName:c466d975-16d2-4ae9-8d08-159d0c6f360e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:20.049160869 +0000 UTC m=+49.853347719 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls") pod "dns-default-q7wjg" (UID: "c466d975-16d2-4ae9-8d08-159d0c6f360e") : secret "dns-default-metrics-tls" not found Apr 16 19:54:12.049332 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.049194 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:12.049332 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.049221 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:12.049332 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.049229 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls podName:d23d4c10-6686-46c3-bcb7-85ee9826dba3 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:20.049218583 +0000 UTC m=+49.853405438 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zlp6j" (UID: "d23d4c10-6686-46c3-bcb7-85ee9826dba3") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:12.049332 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.049248 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert podName:175b0100-2c40-4ff1-993a-ed325cab1d64 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:20.04923998 +0000 UTC m=+49.853426833 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert") pod "ingress-canary-9fsdq" (UID: "175b0100-2c40-4ff1-993a-ed325cab1d64") : secret "canary-serving-cert" not found Apr 16 19:54:12.049332 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.049288 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:12.049332 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.049314 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:20.049305364 +0000 UTC m=+49.853492212 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : secret "router-metrics-certs-default" not found Apr 16 19:54:12.150110 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:12.150024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:12.150277 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.150200 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:12.150336 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:12.150279 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert podName:57623683-701c-481e-a07f-ba6e226f7785 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:13.150259915 +0000 UTC m=+42.954446767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vrm46" (UID: "57623683-701c-481e-a07f-ba6e226f7785") : secret "networking-console-plugin-cert" not found Apr 16 19:54:13.159281 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:13.159245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:13.159773 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:13.159420 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:13.159773 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:13.159522 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert podName:57623683-701c-481e-a07f-ba6e226f7785 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.159491692 +0000 UTC m=+44.963678541 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vrm46" (UID: "57623683-701c-481e-a07f-ba6e226f7785") : secret "networking-console-plugin-cert" not found Apr 16 19:54:13.866247 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:13.866202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:54:13.868777 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:13.868720 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dfa5879b-9279-4460-929b-8800e9ce40bf-original-pull-secret\") pod \"global-pull-secret-syncer-hk2ln\" (UID: \"dfa5879b-9279-4460-929b-8800e9ce40bf\") " pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:54:13.924176 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:13.924144 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hk2ln" Apr 16 19:54:15.180398 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:15.180362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:15.180782 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:15.180518 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:15.180782 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:15.180607 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert podName:57623683-701c-481e-a07f-ba6e226f7785 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:19.180588758 +0000 UTC m=+48.984775611 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vrm46" (UID: "57623683-701c-481e-a07f-ba6e226f7785") : secret "networking-console-plugin-cert" not found Apr 16 19:54:16.045002 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.043571 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97"] Apr 16 19:54:16.084402 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.082890 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hk2ln"] Apr 16 19:54:16.981990 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.981582 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" event={"ID":"631ca511-9b5c-4215-97a0-cd40a1f88ddd","Type":"ContainerStarted","Data":"59b685a8972d7955a5ecb483fdb24fa645e0bff354071fe68ecd13d4e2c0c368"} Apr 16 19:54:16.985501 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.984912 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-t672v" event={"ID":"c5bd5e90-47ac-41b5-bc0c-07feb9989bab","Type":"ContainerStarted","Data":"ec509a703d2040db31bbabce43d366c8d8d1e2ab55d0ea512492d4fc8cb48be4"} Apr 16 19:54:16.987962 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.987821 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" event={"ID":"52f124ce-6f9c-4329-9b94-708c5c6715f6","Type":"ContainerStarted","Data":"790f0f0ac74c7ff91eccff063f2a29079b1a05892e451343ee2bb60af8c98f25"} Apr 16 19:54:16.990201 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.989616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59" event={"ID":"879830d9-0b32-47cc-bd4a-5600123b9e43","Type":"ContainerStarted","Data":"c67ecd25449dad11908c2f55104c2f1c8107b5638527d18818938f2884283264"} Apr 16 19:54:16.990201 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.990147 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59" Apr 16 19:54:16.991328 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.991277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm" event={"ID":"0ea503dc-2cac-4e5c-82ad-73e2f2babdb3","Type":"ContainerStarted","Data":"a2244e9d46b99669c4a76b4856c4a853f50bd73a421abc79be70ebbc01780093"} Apr 16 19:54:16.992189 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.992150 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59" Apr 16 19:54:16.996606 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.995934 2574 generic.go:358] "Generic (PLEG): container finished" podID="cf3e916e-8fa0-480a-b696-be499c883f60" containerID="fe90a242c67ef992b70f801851514e102011bf3066bac8a4c98d52842e1be0f4" exitCode=0 Apr 16 19:54:16.996606 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.996004 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5srtm" event={"ID":"cf3e916e-8fa0-480a-b696-be499c883f60","Type":"ContainerDied","Data":"fe90a242c67ef992b70f801851514e102011bf3066bac8a4c98d52842e1be0f4"} Apr 16 19:54:16.998844 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:16.998388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" 
event={"ID":"76fbc99b-bd4e-4ed8-9580-4a1845bf152f","Type":"ContainerStarted","Data":"610b0da5ac61695659f4b784e8292e7ec4f779f921d7b2591ef60c230fc49b5f"} Apr 16 19:54:17.003033 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.002643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6" event={"ID":"4f8ec4e6-16f1-43d3-a059-d546a6492815","Type":"ContainerStarted","Data":"28130f2533fb4195a81b6a4275740644e55ab5b30f00c08b2140393adb53e1ed"} Apr 16 19:54:17.003033 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.002624 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" podStartSLOduration=6.846462865 podStartE2EDuration="18.002608597s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:04.736693309 +0000 UTC m=+34.540880165" lastFinishedPulling="2026-04-16 19:54:15.892839036 +0000 UTC m=+45.697025897" observedRunningTime="2026-04-16 19:54:17.000087427 +0000 UTC m=+46.804274327" watchObservedRunningTime="2026-04-16 19:54:17.002608597 +0000 UTC m=+46.806795470" Apr 16 19:54:17.008024 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.007961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hk2ln" event={"ID":"dfa5879b-9279-4460-929b-8800e9ce40bf","Type":"ContainerStarted","Data":"b41e554cb383fa24903e533bda4c7487a0efe7113874a91943de78d465275f31"} Apr 16 19:54:17.010119 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.010063 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97" event={"ID":"7663f14f-d056-4025-928a-0110e843ff4c","Type":"ContainerStarted","Data":"23c3f61086bbfd973f61786344f817d9a30ca1b0ac9eac112f59e6596f40d6f0"} Apr 16 19:54:17.014127 ip-10-0-137-239 kubenswrapper[2574]: I0416 
19:54:17.013356 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/0.log" Apr 16 19:54:17.014127 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.013395 2574 generic.go:358] "Generic (PLEG): container finished" podID="efaa6ad2-6fd5-4047-90af-f4b40a394f8f" containerID="e56a6f6c25de6a6b7a91d5cbc1e0e3f31ef99d06769e3f2db109252fe49f37ba" exitCode=255 Apr 16 19:54:17.014127 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.013441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" event={"ID":"efaa6ad2-6fd5-4047-90af-f4b40a394f8f","Type":"ContainerDied","Data":"e56a6f6c25de6a6b7a91d5cbc1e0e3f31ef99d06769e3f2db109252fe49f37ba"} Apr 16 19:54:17.014127 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.013807 2574 scope.go:117] "RemoveContainer" containerID="e56a6f6c25de6a6b7a91d5cbc1e0e3f31ef99d06769e3f2db109252fe49f37ba" Apr 16 19:54:17.074933 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.074830 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-t672v" podStartSLOduration=6.907613984 podStartE2EDuration="18.074809594s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:04.724532729 +0000 UTC m=+34.528719577" lastFinishedPulling="2026-04-16 19:54:15.89172831 +0000 UTC m=+45.695915187" observedRunningTime="2026-04-16 19:54:17.074285457 +0000 UTC m=+46.878472329" watchObservedRunningTime="2026-04-16 19:54:17.074809594 +0000 UTC m=+46.878996463" Apr 16 19:54:17.105450 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.104449 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-676775ffb5-dsbxm" podStartSLOduration=3.8870833769999997 
podStartE2EDuration="15.104429125s" podCreationTimestamp="2026-04-16 19:54:02 +0000 UTC" firstStartedPulling="2026-04-16 19:54:04.742024878 +0000 UTC m=+34.546211741" lastFinishedPulling="2026-04-16 19:54:15.959370628 +0000 UTC m=+45.763557489" observedRunningTime="2026-04-16 19:54:17.102217573 +0000 UTC m=+46.906404445" watchObservedRunningTime="2026-04-16 19:54:17.104429125 +0000 UTC m=+46.908615998" Apr 16 19:54:17.132957 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.132906 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" podStartSLOduration=6.935983305 podStartE2EDuration="18.132886714s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:04.694823645 +0000 UTC m=+34.499010494" lastFinishedPulling="2026-04-16 19:54:15.891727042 +0000 UTC m=+45.695913903" observedRunningTime="2026-04-16 19:54:17.130902647 +0000 UTC m=+46.935089518" watchObservedRunningTime="2026-04-16 19:54:17.132886714 +0000 UTC m=+46.937073587" Apr 16 19:54:17.159402 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.159177 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69689b7f44-mvt59" podStartSLOduration=3.994315653 podStartE2EDuration="15.159156793s" podCreationTimestamp="2026-04-16 19:54:02 +0000 UTC" firstStartedPulling="2026-04-16 19:54:04.76952151 +0000 UTC m=+34.573708372" lastFinishedPulling="2026-04-16 19:54:15.934362645 +0000 UTC m=+45.738549512" observedRunningTime="2026-04-16 19:54:17.157541167 +0000 UTC m=+46.961728034" watchObservedRunningTime="2026-04-16 19:54:17.159156793 +0000 UTC m=+46.963343666" Apr 16 19:54:17.225651 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:17.225422 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kfft6" 
podStartSLOduration=6.997298864 podStartE2EDuration="18.225401583s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:04.654740231 +0000 UTC m=+34.458927094" lastFinishedPulling="2026-04-16 19:54:15.882842949 +0000 UTC m=+45.687029813" observedRunningTime="2026-04-16 19:54:17.186656977 +0000 UTC m=+46.990843849" watchObservedRunningTime="2026-04-16 19:54:17.225401583 +0000 UTC m=+47.029588454" Apr 16 19:54:18.024269 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:18.022110 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 19:54:18.025142 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:18.025117 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/0.log" Apr 16 19:54:18.025246 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:18.025162 2574 generic.go:358] "Generic (PLEG): container finished" podID="efaa6ad2-6fd5-4047-90af-f4b40a394f8f" containerID="7ade30d9f6408844c384228d1a241460769129f00b0a2af7a0a5bb23d026b32e" exitCode=255 Apr 16 19:54:18.025596 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:18.025577 2574 scope.go:117] "RemoveContainer" containerID="7ade30d9f6408844c384228d1a241460769129f00b0a2af7a0a5bb23d026b32e" Apr 16 19:54:18.026086 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:18.025836 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" event={"ID":"efaa6ad2-6fd5-4047-90af-f4b40a394f8f","Type":"ContainerDied","Data":"7ade30d9f6408844c384228d1a241460769129f00b0a2af7a0a5bb23d026b32e"} Apr 16 19:54:18.026086 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:18.025891 2574 scope.go:117] "RemoveContainer" containerID="e56a6f6c25de6a6b7a91d5cbc1e0e3f31ef99d06769e3f2db109252fe49f37ba" 
Apr 16 19:54:18.026086 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:18.025895 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ctpdp_openshift-console-operator(efaa6ad2-6fd5-4047-90af-f4b40a394f8f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" podUID="efaa6ad2-6fd5-4047-90af-f4b40a394f8f" Apr 16 19:54:18.031236 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:18.031212 2574 generic.go:358] "Generic (PLEG): container finished" podID="cf3e916e-8fa0-480a-b696-be499c883f60" containerID="031dd76172a9f87c702a09ebdc5f2748d2d85232cafc05f958a8c0da431082e7" exitCode=0 Apr 16 19:54:18.033646 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:18.032232 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5srtm" event={"ID":"cf3e916e-8fa0-480a-b696-be499c883f60","Type":"ContainerDied","Data":"031dd76172a9f87c702a09ebdc5f2748d2d85232cafc05f958a8c0da431082e7"} Apr 16 19:54:19.036690 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.035975 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 19:54:19.036690 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.036411 2574 scope.go:117] "RemoveContainer" containerID="7ade30d9f6408844c384228d1a241460769129f00b0a2af7a0a5bb23d026b32e" Apr 16 19:54:19.036690 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:19.036573 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ctpdp_openshift-console-operator(efaa6ad2-6fd5-4047-90af-f4b40a394f8f)\"" 
pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" podUID="efaa6ad2-6fd5-4047-90af-f4b40a394f8f" Apr 16 19:54:19.042602 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.042568 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5srtm" event={"ID":"cf3e916e-8fa0-480a-b696-be499c883f60","Type":"ContainerStarted","Data":"3b34166b79e39bdb9627dc1efdc6c464d1252b841e04d712e5f7e9cfe70b997b"} Apr 16 19:54:19.084172 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.083897 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5srtm" podStartSLOduration=6.528783005 podStartE2EDuration="48.083875134s" podCreationTimestamp="2026-04-16 19:53:31 +0000 UTC" firstStartedPulling="2026-04-16 19:53:33.596969674 +0000 UTC m=+3.401156525" lastFinishedPulling="2026-04-16 19:54:15.152061801 +0000 UTC m=+44.956248654" observedRunningTime="2026-04-16 19:54:19.082647323 +0000 UTC m=+48.886834194" watchObservedRunningTime="2026-04-16 19:54:19.083875134 +0000 UTC m=+48.888062006" Apr 16 19:54:19.232653 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.232601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:19.232835 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:19.232792 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:19.232919 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:19.232878 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert podName:57623683-701c-481e-a07f-ba6e226f7785 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.232857976 +0000 UTC m=+57.037044829 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vrm46" (UID: "57623683-701c-481e-a07f-ba6e226f7785") : secret "networking-console-plugin-cert" not found Apr 16 19:54:19.859192 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.859155 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fc2mq"] Apr 16 19:54:19.890094 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.890056 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fc2mq"] Apr 16 19:54:19.890256 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.890196 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:19.892300 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.892272 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:54:19.892300 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.892293 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:54:19.892567 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.892552 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8tkc8\"" Apr 16 19:54:19.940438 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.940395 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f24a8157-b359-4258-8496-a80f0514a050-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:19.940603 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.940537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:19.940603 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.940569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f24a8157-b359-4258-8496-a80f0514a050-data-volume\") pod \"insights-runtime-extractor-fc2mq\" (UID: 
\"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:19.940603 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.940589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:19.940795 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.940616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grrhv\" (UniqueName: \"kubernetes.io/projected/f24a8157-b359-4258-8496-a80f0514a050-kube-api-access-grrhv\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:19.940795 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:19.940710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f24a8157-b359-4258-8496-a80f0514a050-crio-socket\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:19.940795 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:19.940717 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:19.940902 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:19.940881 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls podName:417cddde-9bfc-4522-af0d-b947b69c5362 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:35.940859762 +0000 UTC m=+65.745046625 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gthr" (UID: "417cddde-9bfc-4522-af0d-b947b69c5362") : secret "samples-operator-tls" not found Apr 16 19:54:20.042188 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.042147 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f24a8157-b359-4258-8496-a80f0514a050-data-volume\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.042203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.042233 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grrhv\" (UniqueName: \"kubernetes.io/projected/f24a8157-b359-4258-8496-a80f0514a050-kube-api-access-grrhv\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.042266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f24a8157-b359-4258-8496-a80f0514a050-crio-socket\") pod 
\"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.042297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.042367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f24a8157-b359-4258-8496-a80f0514a050-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.042373 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f24a8157-b359-4258-8496-a80f0514a050-crio-socket\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.042388 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.042463 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.042477 2574 projected.go:194] Error preparing data for projected volume 
registry-tls for pod openshift-image-registry/image-registry-747fd5949f-wpqct: secret "image-registry-tls" not found Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.042542 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls podName:26e84dd9-5d40-41dd-95a3-95e087c04263 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.042522036 +0000 UTC m=+65.846708904 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls") pod "image-registry-747fd5949f-wpqct" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263") : secret "image-registry-tls" not found Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.042551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f24a8157-b359-4258-8496-a80f0514a050-data-volume\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.042623 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.042624 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls podName:f24a8157-b359-4258-8496-a80f0514a050 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:20.542607143 +0000 UTC m=+50.346793997 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fc2mq" (UID: "f24a8157-b359-4258-8496-a80f0514a050") : secret "insights-runtime-extractor-tls" not found Apr 16 19:54:20.043190 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.043008 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f24a8157-b359-4258-8496-a80f0514a050-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.050232 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.050205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grrhv\" (UniqueName: \"kubernetes.io/projected/f24a8157-b359-4258-8496-a80f0514a050-kube-api-access-grrhv\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.143555 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.143466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:20.143555 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.143522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: 
\"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:20.143792 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.143629 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:20.143792 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.143657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:20.143792 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.143672 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.143650684 +0000 UTC m=+65.947837547 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:20.143792 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.143710 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls podName:d23d4c10-6686-46c3-bcb7-85ee9826dba3 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.143695012 +0000 UTC m=+65.947881869 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zlp6j" (UID: "d23d4c10-6686-46c3-bcb7-85ee9826dba3") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:20.143792 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.143785 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:20.144037 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.143804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:20.144037 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.143895 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:20.144037 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.143938 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs podName:4bbbd110-8877-408f-94d0-0ffb4ab8ed60 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.143920722 +0000 UTC m=+65.948107592 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs") pod "router-default-696c89d9db-bdmt2" (UID: "4bbbd110-8877-408f-94d0-0ffb4ab8ed60") : secret "router-metrics-certs-default" not found Apr 16 19:54:20.144037 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.143891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:20.144037 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.143954 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:20.144037 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.143991 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls podName:c466d975-16d2-4ae9-8d08-159d0c6f360e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.143979626 +0000 UTC m=+65.948166495 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls") pod "dns-default-q7wjg" (UID: "c466d975-16d2-4ae9-8d08-159d0c6f360e") : secret "dns-default-metrics-tls" not found Apr 16 19:54:20.144037 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.144017 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert podName:175b0100-2c40-4ff1-993a-ed325cab1d64 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.144007016 +0000 UTC m=+65.948193865 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert") pod "ingress-canary-9fsdq" (UID: "175b0100-2c40-4ff1-993a-ed325cab1d64") : secret "canary-serving-cert" not found Apr 16 19:54:20.547510 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:20.547416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:20.547674 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.547585 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:54:20.547674 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:20.547651 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls podName:f24a8157-b359-4258-8496-a80f0514a050 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:21.547634895 +0000 UTC m=+51.351821745 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fc2mq" (UID: "f24a8157-b359-4258-8496-a80f0514a050") : secret "insights-runtime-extractor-tls" not found Apr 16 19:54:21.272107 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:21.272066 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4gptc_67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca/dns-node-resolver/0.log" Apr 16 19:54:21.557533 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:21.557448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:21.557703 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:21.557590 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:54:21.557703 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:21.557659 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls podName:f24a8157-b359-4258-8496-a80f0514a050 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:23.557643667 +0000 UTC m=+53.361830518 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fc2mq" (UID: "f24a8157-b359-4258-8496-a80f0514a050") : secret "insights-runtime-extractor-tls" not found Apr 16 19:54:22.472354 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:22.472301 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l8zj8_a03880fb-0202-4f2e-9f09-4064525141cd/node-ca/0.log" Apr 16 19:54:23.055992 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:23.055950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" event={"ID":"52f124ce-6f9c-4329-9b94-708c5c6715f6","Type":"ContainerStarted","Data":"3d2426b77bf61563c78a68afada513c995a181f6105eebd0a6b8d69bf2c904d0"} Apr 16 19:54:23.056181 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:23.055996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" event={"ID":"52f124ce-6f9c-4329-9b94-708c5c6715f6","Type":"ContainerStarted","Data":"044b049c531b82c8d32a65fa88a346862e4fde1ce1a4a6c545c3ddad4b7886e5"} Apr 16 19:54:23.057960 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:23.057397 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hk2ln" event={"ID":"dfa5879b-9279-4460-929b-8800e9ce40bf","Type":"ContainerStarted","Data":"9825dd7a348f5c61412853bf3a9a92972570215603a11ae0ad0794a199b7c3f9"} Apr 16 19:54:23.058836 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:23.058813 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97" 
event={"ID":"7663f14f-d056-4025-928a-0110e843ff4c","Type":"ContainerStarted","Data":"b69af1fa07bdf9d5cf8a96fed38d4c88bde8f52aa15207c693c53dfafad93903"} Apr 16 19:54:23.078947 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:23.078901 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" podStartSLOduration=3.593181821 podStartE2EDuration="21.078887571s" podCreationTimestamp="2026-04-16 19:54:02 +0000 UTC" firstStartedPulling="2026-04-16 19:54:04.797826199 +0000 UTC m=+34.602013047" lastFinishedPulling="2026-04-16 19:54:22.283531935 +0000 UTC m=+52.087718797" observedRunningTime="2026-04-16 19:54:23.077229638 +0000 UTC m=+52.881416509" watchObservedRunningTime="2026-04-16 19:54:23.078887571 +0000 UTC m=+52.883074437" Apr 16 19:54:23.095870 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:23.095817 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qqm97" podStartSLOduration=10.893825382 podStartE2EDuration="17.095802696s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.082108926 +0000 UTC m=+45.886295774" lastFinishedPulling="2026-04-16 19:54:22.284086222 +0000 UTC m=+52.088273088" observedRunningTime="2026-04-16 19:54:23.094501612 +0000 UTC m=+52.898688495" watchObservedRunningTime="2026-04-16 19:54:23.095802696 +0000 UTC m=+52.899989565" Apr 16 19:54:23.112872 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:23.112827 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hk2ln" podStartSLOduration=35.904252819999996 podStartE2EDuration="42.112812135s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.087267331 +0000 UTC m=+45.891454202" lastFinishedPulling="2026-04-16 19:54:22.295826658 +0000 UTC m=+52.100013517" 
observedRunningTime="2026-04-16 19:54:23.110779406 +0000 UTC m=+52.914966269" watchObservedRunningTime="2026-04-16 19:54:23.112812135 +0000 UTC m=+52.916999010" Apr 16 19:54:23.577695 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:23.577654 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:23.578128 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:23.577826 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:54:23.578128 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:23.577900 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls podName:f24a8157-b359-4258-8496-a80f0514a050 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.577884307 +0000 UTC m=+57.382071159 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fc2mq" (UID: "f24a8157-b359-4258-8496-a80f0514a050") : secret "insights-runtime-extractor-tls" not found Apr 16 19:54:23.871275 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:23.871244 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-nkcw5_631ca511-9b5c-4215-97a0-cd40a1f88ddd/kube-storage-version-migrator-operator/0.log" Apr 16 19:54:24.383673 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:24.383642 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:24.383840 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:24.383681 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:24.384082 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:24.384068 2574 scope.go:117] "RemoveContainer" containerID="7ade30d9f6408844c384228d1a241460769129f00b0a2af7a0a5bb23d026b32e" Apr 16 19:54:24.384264 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:24.384248 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ctpdp_openshift-console-operator(efaa6ad2-6fd5-4047-90af-f4b40a394f8f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" podUID="efaa6ad2-6fd5-4047-90af-f4b40a394f8f" Apr 16 19:54:27.314372 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:27.314325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:27.314818 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:27.314477 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:27.314818 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:27.314544 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert podName:57623683-701c-481e-a07f-ba6e226f7785 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:43.314528763 +0000 UTC m=+73.118715611 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vrm46" (UID: "57623683-701c-481e-a07f-ba6e226f7785") : secret "networking-console-plugin-cert" not found Apr 16 19:54:27.616952 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:27.616913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:27.617169 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:54:27.617079 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:54:27.617169 ip-10-0-137-239 kubenswrapper[2574]: E0416 
19:54:27.617161 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls podName:f24a8157-b359-4258-8496-a80f0514a050 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:35.617140993 +0000 UTC m=+65.421327859 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fc2mq" (UID: "f24a8157-b359-4258-8496-a80f0514a050") : secret "insights-runtime-extractor-tls" not found Apr 16 19:54:29.912677 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:29.912647 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-726jz" Apr 16 19:54:35.691714 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:35.691679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:35.693933 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:35.693914 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f24a8157-b359-4258-8496-a80f0514a050-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fc2mq\" (UID: \"f24a8157-b359-4258-8496-a80f0514a050\") " pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:35.807044 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:35.807017 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8tkc8\"" Apr 16 19:54:35.815970 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:35.815949 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fc2mq" Apr 16 19:54:35.949691 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:35.949615 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fc2mq"] Apr 16 19:54:35.952934 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:35.952896 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24a8157_b359_4258_8496_a80f0514a050.slice/crio-7d801986fb2c32dcfea20516a4d9d21c75037c9299661fb21e34e784a1308b45 WatchSource:0}: Error finding container 7d801986fb2c32dcfea20516a4d9d21c75037c9299661fb21e34e784a1308b45: Status 404 returned error can't find the container with id 7d801986fb2c32dcfea20516a4d9d21c75037c9299661fb21e34e784a1308b45 Apr 16 19:54:35.995374 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:35.995350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:35.997577 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:35.997554 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/417cddde-9bfc-4522-af0d-b947b69c5362-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gthr\" (UID: \"417cddde-9bfc-4522-af0d-b947b69c5362\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 
19:54:36.095888 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.095852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:36.097433 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.097403 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fc2mq" event={"ID":"f24a8157-b359-4258-8496-a80f0514a050","Type":"ContainerStarted","Data":"951c50115e9989493348cda3f5e89615def6852ec65bb06b04c4ac0b5482f6c9"} Apr 16 19:54:36.097433 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.097438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fc2mq" event={"ID":"f24a8157-b359-4258-8496-a80f0514a050","Type":"ContainerStarted","Data":"7d801986fb2c32dcfea20516a4d9d21c75037c9299661fb21e34e784a1308b45"} Apr 16 19:54:36.098372 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.098348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"image-registry-747fd5949f-wpqct\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:36.156540 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.156504 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-z8vjn\"" Apr 16 19:54:36.165269 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.165247 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" Apr 16 19:54:36.196998 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.196969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:36.197171 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.197032 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:36.197171 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.197088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:36.197171 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.197105 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:36.197339 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.197216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:36.197794 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.197743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-service-ca-bundle\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:36.199411 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.199376 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bbbd110-8877-408f-94d0-0ffb4ab8ed60-metrics-certs\") pod \"router-default-696c89d9db-bdmt2\" (UID: \"4bbbd110-8877-408f-94d0-0ffb4ab8ed60\") " pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:36.199837 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.199780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c466d975-16d2-4ae9-8d08-159d0c6f360e-metrics-tls\") pod \"dns-default-q7wjg\" (UID: \"c466d975-16d2-4ae9-8d08-159d0c6f360e\") " pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:36.199904 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.199850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175b0100-2c40-4ff1-993a-ed325cab1d64-cert\") pod \"ingress-canary-9fsdq\" (UID: \"175b0100-2c40-4ff1-993a-ed325cab1d64\") " pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:36.200122 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.200100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d23d4c10-6686-46c3-bcb7-85ee9826dba3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zlp6j\" (UID: \"d23d4c10-6686-46c3-bcb7-85ee9826dba3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:36.234968 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.234943 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q97t2\"" Apr 16 19:54:36.244019 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.243991 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:36.284579 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.284546 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr"] Apr 16 19:54:36.290074 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.290047 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-sdtlb\"" Apr 16 19:54:36.298342 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.298289 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" Apr 16 19:54:36.325039 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.324854 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-fg4fh\"" Apr 16 19:54:36.331846 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.331828 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:36.339013 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.338994 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7n2jf\"" Apr 16 19:54:36.347944 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.347909 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9fsdq" Apr 16 19:54:36.376858 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.376627 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-747fd5949f-wpqct"] Apr 16 19:54:36.382641 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:36.382443 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e84dd9_5d40_41dd_95a3_95e087c04263.slice/crio-404052a1d6f976cac5ee0323df44f231ed2900207d862e069084ac6b7781d7a2 WatchSource:0}: Error finding container 404052a1d6f976cac5ee0323df44f231ed2900207d862e069084ac6b7781d7a2: Status 404 returned error can't find the container with id 404052a1d6f976cac5ee0323df44f231ed2900207d862e069084ac6b7781d7a2 Apr 16 19:54:36.387607 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.387379 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8flcv\"" Apr 16 19:54:36.396263 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.396228 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:36.450043 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.449975 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j"] Apr 16 19:54:36.452865 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:36.452834 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd23d4c10_6686_46c3_bcb7_85ee9826dba3.slice/crio-5ed911c557f83aab1d24a5b7a2ca14136c3617bd791a89c138bf2987535d6ed1 WatchSource:0}: Error finding container 5ed911c557f83aab1d24a5b7a2ca14136c3617bd791a89c138bf2987535d6ed1: Status 404 returned error can't find the container with id 5ed911c557f83aab1d24a5b7a2ca14136c3617bd791a89c138bf2987535d6ed1 Apr 16 19:54:36.500794 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.500723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:54:36.503859 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.503191 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:36.505438 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.504976 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9fsdq"] Apr 16 19:54:36.508155 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:36.508126 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175b0100_2c40_4ff1_993a_ed325cab1d64.slice/crio-1f395ce62de490acbf5bec062aef838d1722b615eec29189533cd6b0ba1de400 WatchSource:0}: Error finding 
container 1f395ce62de490acbf5bec062aef838d1722b615eec29189533cd6b0ba1de400: Status 404 returned error can't find the container with id 1f395ce62de490acbf5bec062aef838d1722b615eec29189533cd6b0ba1de400 Apr 16 19:54:36.515863 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.515839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884798a9-cac3-41a4-af20-f3c01d50646e-metrics-certs\") pod \"network-metrics-daemon-bdzq7\" (UID: \"884798a9-cac3-41a4-af20-f3c01d50646e\") " pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:54:36.525310 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.525275 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-696c89d9db-bdmt2"] Apr 16 19:54:36.528830 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:36.528789 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bbbd110_8877_408f_94d0_0ffb4ab8ed60.slice/crio-19f94f87430fa08362a20102c89aa93a4b5f03643639feef04e167e648140fcc WatchSource:0}: Error finding container 19f94f87430fa08362a20102c89aa93a4b5f03643639feef04e167e648140fcc: Status 404 returned error can't find the container with id 19f94f87430fa08362a20102c89aa93a4b5f03643639feef04e167e648140fcc Apr 16 19:54:36.572865 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.572831 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q7wjg"] Apr 16 19:54:36.601978 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.601947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zqd\" (UniqueName: \"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:54:36.604895 
ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.604862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2zqd\" (UniqueName: \"kubernetes.io/projected/fe26e572-de88-4d80-bc05-a05fc220448c-kube-api-access-v2zqd\") pod \"network-check-target-56m94\" (UID: \"fe26e572-de88-4d80-bc05-a05fc220448c\") " pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:54:36.659705 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:36.659669 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc466d975_16d2_4ae9_8d08_159d0c6f360e.slice/crio-17bbabc8da91630827037f442144f16e2dcb3fd03d2e6684e143c6cd4a89419b WatchSource:0}: Error finding container 17bbabc8da91630827037f442144f16e2dcb3fd03d2e6684e143c6cd4a89419b: Status 404 returned error can't find the container with id 17bbabc8da91630827037f442144f16e2dcb3fd03d2e6684e143c6cd4a89419b Apr 16 19:54:36.712921 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.712898 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2xst7\"" Apr 16 19:54:36.721522 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.721494 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdzq7" Apr 16 19:54:36.735844 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.735806 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p7kgs\"" Apr 16 19:54:36.744503 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.744477 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:54:36.873733 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.873702 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bdzq7"] Apr 16 19:54:36.877237 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:36.877208 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884798a9_cac3_41a4_af20_f3c01d50646e.slice/crio-4721f66ed6796db9c8ec025e02a3cdcdd2b772023a6f6d1643c7fea722a15181 WatchSource:0}: Error finding container 4721f66ed6796db9c8ec025e02a3cdcdd2b772023a6f6d1643c7fea722a15181: Status 404 returned error can't find the container with id 4721f66ed6796db9c8ec025e02a3cdcdd2b772023a6f6d1643c7fea722a15181 Apr 16 19:54:36.893268 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:36.893239 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-56m94"] Apr 16 19:54:36.897710 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:36.897678 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe26e572_de88_4d80_bc05_a05fc220448c.slice/crio-2e3ab6d0706144cab2791cfb7f09b94eb45c25dc18912f2b80f10b606de159af WatchSource:0}: Error finding container 2e3ab6d0706144cab2791cfb7f09b94eb45c25dc18912f2b80f10b606de159af: Status 404 returned error can't find the container with id 2e3ab6d0706144cab2791cfb7f09b94eb45c25dc18912f2b80f10b606de159af Apr 16 19:54:37.103318 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.103276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-696c89d9db-bdmt2" event={"ID":"4bbbd110-8877-408f-94d0-0ffb4ab8ed60","Type":"ContainerStarted","Data":"b84a1cc5ef3ba1bfc830e2437bae3220ee5a529eaf4e48da27c975aee840f8db"} Apr 16 19:54:37.103318 ip-10-0-137-239 kubenswrapper[2574]: 
I0416 19:54:37.103335 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-696c89d9db-bdmt2" event={"ID":"4bbbd110-8877-408f-94d0-0ffb4ab8ed60","Type":"ContainerStarted","Data":"19f94f87430fa08362a20102c89aa93a4b5f03643639feef04e167e648140fcc"} Apr 16 19:54:37.105068 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.105024 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" event={"ID":"417cddde-9bfc-4522-af0d-b947b69c5362","Type":"ContainerStarted","Data":"c1191eacf1e08188c5eeedfe2d8378f9ca85c3219958149fc7ccdf52b6a4bc34"} Apr 16 19:54:37.107116 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.106889 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bdzq7" event={"ID":"884798a9-cac3-41a4-af20-f3c01d50646e","Type":"ContainerStarted","Data":"4721f66ed6796db9c8ec025e02a3cdcdd2b772023a6f6d1643c7fea722a15181"} Apr 16 19:54:37.108852 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.108817 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q7wjg" event={"ID":"c466d975-16d2-4ae9-8d08-159d0c6f360e","Type":"ContainerStarted","Data":"17bbabc8da91630827037f442144f16e2dcb3fd03d2e6684e143c6cd4a89419b"} Apr 16 19:54:37.110347 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.110324 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" event={"ID":"d23d4c10-6686-46c3-bcb7-85ee9826dba3","Type":"ContainerStarted","Data":"5ed911c557f83aab1d24a5b7a2ca14136c3617bd791a89c138bf2987535d6ed1"} Apr 16 19:54:37.112174 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.112150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9fsdq" 
event={"ID":"175b0100-2c40-4ff1-993a-ed325cab1d64","Type":"ContainerStarted","Data":"1f395ce62de490acbf5bec062aef838d1722b615eec29189533cd6b0ba1de400"} Apr 16 19:54:37.113944 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.113897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fc2mq" event={"ID":"f24a8157-b359-4258-8496-a80f0514a050","Type":"ContainerStarted","Data":"acfbef4851eadead4fcb78b1c86c5a5101f965bf992ebd2b38493a8041d42e1d"} Apr 16 19:54:37.115746 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.115720 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" event={"ID":"26e84dd9-5d40-41dd-95a3-95e087c04263","Type":"ContainerStarted","Data":"4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013"} Apr 16 19:54:37.115852 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.115750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" event={"ID":"26e84dd9-5d40-41dd-95a3-95e087c04263","Type":"ContainerStarted","Data":"404052a1d6f976cac5ee0323df44f231ed2900207d862e069084ac6b7781d7a2"} Apr 16 19:54:37.116406 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.116373 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:54:37.118226 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.118178 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-56m94" event={"ID":"fe26e572-de88-4d80-bc05-a05fc220448c","Type":"ContainerStarted","Data":"85e33819c7e1d2fa1d929edd19be4416d5dffb3e76eadee2da7ca5dcb628565d"} Apr 16 19:54:37.118226 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.118205 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-56m94" 
event={"ID":"fe26e572-de88-4d80-bc05-a05fc220448c","Type":"ContainerStarted","Data":"2e3ab6d0706144cab2791cfb7f09b94eb45c25dc18912f2b80f10b606de159af"} Apr 16 19:54:37.118555 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.118496 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:54:37.138806 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.137410 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-696c89d9db-bdmt2" podStartSLOduration=38.137390412 podStartE2EDuration="38.137390412s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:37.122431778 +0000 UTC m=+66.926618650" watchObservedRunningTime="2026-04-16 19:54:37.137390412 +0000 UTC m=+66.941577285" Apr 16 19:54:37.159544 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.158088 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" podStartSLOduration=54.158068672 podStartE2EDuration="54.158068672s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:37.157393067 +0000 UTC m=+66.961579937" watchObservedRunningTime="2026-04-16 19:54:37.158068672 +0000 UTC m=+66.962255573" Apr 16 19:54:37.159544 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.159235 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-56m94" podStartSLOduration=66.159223772 podStartE2EDuration="1m6.159223772s" podCreationTimestamp="2026-04-16 19:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:37.139118947 +0000 UTC m=+66.943305817" watchObservedRunningTime="2026-04-16 19:54:37.159223772 +0000 UTC m=+66.963410643" Apr 16 19:54:37.332271 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.332214 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:37.335294 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:37.335099 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:38.123446 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:38.123417 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:38.125137 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:38.124943 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-696c89d9db-bdmt2" Apr 16 19:54:39.790240 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:39.790208 2574 scope.go:117] "RemoveContainer" containerID="7ade30d9f6408844c384228d1a241460769129f00b0a2af7a0a5bb23d026b32e" Apr 16 19:54:41.134178 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.134145 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" event={"ID":"417cddde-9bfc-4522-af0d-b947b69c5362","Type":"ContainerStarted","Data":"71158c94f3ef80a6909affda18a15d7000d8d737a328b7c7d75b030148b63952"} Apr 16 19:54:41.134631 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.134186 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" 
event={"ID":"417cddde-9bfc-4522-af0d-b947b69c5362","Type":"ContainerStarted","Data":"cdb274ea170c7f73f83ea51c61ba31b8cfb4097403b1622cd60df812a3b579d1"} Apr 16 19:54:41.135904 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.135878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bdzq7" event={"ID":"884798a9-cac3-41a4-af20-f3c01d50646e","Type":"ContainerStarted","Data":"25a3655e6c40cb7527dcf704691ca953cca72852d6c617e9d13efff943146e0a"} Apr 16 19:54:41.136028 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.135911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bdzq7" event={"ID":"884798a9-cac3-41a4-af20-f3c01d50646e","Type":"ContainerStarted","Data":"6c45b15be4e37cf78afbb9a5ff9eb8a94b5b6e75f2aa40a0decba6a11efababd"} Apr 16 19:54:41.137517 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.137493 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q7wjg" event={"ID":"c466d975-16d2-4ae9-8d08-159d0c6f360e","Type":"ContainerStarted","Data":"901200384b3a6b986e7f1f6b6c148cf50c9c6ba6dd4384383c9f0afa4c955eb5"} Apr 16 19:54:41.137613 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.137526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q7wjg" event={"ID":"c466d975-16d2-4ae9-8d08-159d0c6f360e","Type":"ContainerStarted","Data":"62befeabbecf12f10310c698e3ba79f1c6994c0dba8df253abbc74f50aa86b6e"} Apr 16 19:54:41.137683 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.137625 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:41.138932 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.138911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" 
event={"ID":"d23d4c10-6686-46c3-bcb7-85ee9826dba3","Type":"ContainerStarted","Data":"2840fd55a3a59b36e7322e76e5a00bd3eb1eee8755032aa09e899d703e8ba44b"} Apr 16 19:54:41.140307 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.140286 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9fsdq" event={"ID":"175b0100-2c40-4ff1-993a-ed325cab1d64","Type":"ContainerStarted","Data":"efb2f82f490fa0fc41daaf4cfdad325254fb3d9fe15db64ec4408e5065337a5c"} Apr 16 19:54:41.141855 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.141837 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 19:54:41.141944 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.141929 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" event={"ID":"efaa6ad2-6fd5-4047-90af-f4b40a394f8f","Type":"ContainerStarted","Data":"2ac08ba7cdbc70c2b0cab5299dd07c2083b5d8acc2688c39f84f5113bfe5803c"} Apr 16 19:54:41.142240 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.142206 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:41.143833 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.143804 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fc2mq" event={"ID":"f24a8157-b359-4258-8496-a80f0514a050","Type":"ContainerStarted","Data":"56f31ef957221c956a0807ae14eb0f0ab424a5729143053acae4e8427359ef3d"} Apr 16 19:54:41.155983 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.155903 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gthr" podStartSLOduration=39.027679451 podStartE2EDuration="43.155890821s" 
podCreationTimestamp="2026-04-16 19:53:58 +0000 UTC" firstStartedPulling="2026-04-16 19:54:36.331732231 +0000 UTC m=+66.135919080" lastFinishedPulling="2026-04-16 19:54:40.45994359 +0000 UTC m=+70.264130450" observedRunningTime="2026-04-16 19:54:41.155057714 +0000 UTC m=+70.959244595" watchObservedRunningTime="2026-04-16 19:54:41.155890821 +0000 UTC m=+70.960077690" Apr 16 19:54:41.181544 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.181502 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bdzq7" podStartSLOduration=67.600318722 podStartE2EDuration="1m11.181490072s" podCreationTimestamp="2026-04-16 19:53:30 +0000 UTC" firstStartedPulling="2026-04-16 19:54:36.879194513 +0000 UTC m=+66.683381360" lastFinishedPulling="2026-04-16 19:54:40.460365843 +0000 UTC m=+70.264552710" observedRunningTime="2026-04-16 19:54:41.179169526 +0000 UTC m=+70.983356407" watchObservedRunningTime="2026-04-16 19:54:41.181490072 +0000 UTC m=+70.985676941" Apr 16 19:54:41.202073 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.202024 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zlp6j" podStartSLOduration=38.192683213 podStartE2EDuration="42.202007435s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:36.454956455 +0000 UTC m=+66.259143308" lastFinishedPulling="2026-04-16 19:54:40.464280679 +0000 UTC m=+70.268467530" observedRunningTime="2026-04-16 19:54:41.200584049 +0000 UTC m=+71.004770919" watchObservedRunningTime="2026-04-16 19:54:41.202007435 +0000 UTC m=+71.006194308" Apr 16 19:54:41.220591 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.220494 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fc2mq" podStartSLOduration=17.771090259 podStartE2EDuration="22.220477109s" podCreationTimestamp="2026-04-16 
19:54:19 +0000 UTC" firstStartedPulling="2026-04-16 19:54:36.011097965 +0000 UTC m=+65.815284812" lastFinishedPulling="2026-04-16 19:54:40.460484807 +0000 UTC m=+70.264671662" observedRunningTime="2026-04-16 19:54:41.220037007 +0000 UTC m=+71.024223880" watchObservedRunningTime="2026-04-16 19:54:41.220477109 +0000 UTC m=+71.024663980" Apr 16 19:54:41.238215 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.238156 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" podStartSLOduration=31.021375412 podStartE2EDuration="42.238140085s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:04.666082158 +0000 UTC m=+34.470269024" lastFinishedPulling="2026-04-16 19:54:15.882846838 +0000 UTC m=+45.687033697" observedRunningTime="2026-04-16 19:54:41.236303969 +0000 UTC m=+71.040490850" watchObservedRunningTime="2026-04-16 19:54:41.238140085 +0000 UTC m=+71.042326957" Apr 16 19:54:41.256734 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.256676 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9fsdq" podStartSLOduration=33.307653694 podStartE2EDuration="37.256658603s" podCreationTimestamp="2026-04-16 19:54:04 +0000 UTC" firstStartedPulling="2026-04-16 19:54:36.511817536 +0000 UTC m=+66.316004402" lastFinishedPulling="2026-04-16 19:54:40.460822463 +0000 UTC m=+70.265009311" observedRunningTime="2026-04-16 19:54:41.255734732 +0000 UTC m=+71.059921603" watchObservedRunningTime="2026-04-16 19:54:41.256658603 +0000 UTC m=+71.060845476" Apr 16 19:54:41.273998 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.273950 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-q7wjg" podStartSLOduration=33.475047254 podStartE2EDuration="37.273934251s" podCreationTimestamp="2026-04-16 19:54:04 +0000 UTC" firstStartedPulling="2026-04-16 
19:54:36.661480398 +0000 UTC m=+66.465667246" lastFinishedPulling="2026-04-16 19:54:40.460367395 +0000 UTC m=+70.264554243" observedRunningTime="2026-04-16 19:54:41.271718073 +0000 UTC m=+71.075904942" watchObservedRunningTime="2026-04-16 19:54:41.273934251 +0000 UTC m=+71.078121122" Apr 16 19:54:41.362102 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:41.362070 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-ctpdp" Apr 16 19:54:43.369110 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:43.369071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:43.371595 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:43.371559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/57623683-701c-481e-a07f-ba6e226f7785-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vrm46\" (UID: \"57623683-701c-481e-a07f-ba6e226f7785\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:43.560276 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:43.560242 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fd9zf\"" Apr 16 19:54:43.568724 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:43.568703 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" Apr 16 19:54:43.687004 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:43.686979 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vrm46"] Apr 16 19:54:43.689618 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:43.689588 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57623683_701c_481e_a07f_ba6e226f7785.slice/crio-06eb5d16b3061dcd47d013e0dc4eff1917d1aa95627f22065c79a4e2aa6b5ed9 WatchSource:0}: Error finding container 06eb5d16b3061dcd47d013e0dc4eff1917d1aa95627f22065c79a4e2aa6b5ed9: Status 404 returned error can't find the container with id 06eb5d16b3061dcd47d013e0dc4eff1917d1aa95627f22065c79a4e2aa6b5ed9 Apr 16 19:54:44.152646 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:44.152607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" event={"ID":"57623683-701c-481e-a07f-ba6e226f7785","Type":"ContainerStarted","Data":"06eb5d16b3061dcd47d013e0dc4eff1917d1aa95627f22065c79a4e2aa6b5ed9"} Apr 16 19:54:45.157832 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:45.157794 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" event={"ID":"57623683-701c-481e-a07f-ba6e226f7785","Type":"ContainerStarted","Data":"eb7281d2d151bd26b7620c1612f82c1b3c97b38ed480ece61c5d905a61fcb450"} Apr 16 19:54:45.211815 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:45.211748 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vrm46" podStartSLOduration=33.238012746 podStartE2EDuration="34.211730982s" podCreationTimestamp="2026-04-16 19:54:11 +0000 UTC" firstStartedPulling="2026-04-16 19:54:43.691598124 
+0000 UTC m=+73.495784980" lastFinishedPulling="2026-04-16 19:54:44.665316366 +0000 UTC m=+74.469503216" observedRunningTime="2026-04-16 19:54:45.209657878 +0000 UTC m=+75.013844748" watchObservedRunningTime="2026-04-16 19:54:45.211730982 +0000 UTC m=+75.015917894" Apr 16 19:54:51.149931 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:51.149895 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-q7wjg" Apr 16 19:54:54.733247 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.733213 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-67f5h"] Apr 16 19:54:54.763434 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.763404 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.770041 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.769674 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rv4cw\"" Apr 16 19:54:54.770041 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.769695 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:54:54.770299 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.770280 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:54:54.770371 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.770303 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:54:54.770371 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.770307 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:54:54.852746 ip-10-0-137-239 
kubenswrapper[2574]: I0416 19:54:54.852698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8445b411-7903-43fc-9635-1a16478893cc-root\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.852746 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.852742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbh87\" (UniqueName: \"kubernetes.io/projected/8445b411-7903-43fc-9635-1a16478893cc-kube-api-access-fbh87\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.853004 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.852811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8445b411-7903-43fc-9635-1a16478893cc-sys\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.853004 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.852869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8445b411-7903-43fc-9635-1a16478893cc-metrics-client-ca\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.853004 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.852893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-tls\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " 
pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.853004 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.852987 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-wtmp\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.853139 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.853018 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.853139 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.853037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-textfile\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.853139 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.853055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.953503 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953463 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8445b411-7903-43fc-9635-1a16478893cc-metrics-client-ca\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.953503 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-tls\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.953794 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-wtmp\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.953794 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.953794 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-textfile\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.953794 ip-10-0-137-239 kubenswrapper[2574]: 
I0416 19:54:54.953657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.953794 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8445b411-7903-43fc-9635-1a16478893cc-root\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.953794 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-wtmp\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.953794 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbh87\" (UniqueName: \"kubernetes.io/projected/8445b411-7903-43fc-9635-1a16478893cc-kube-api-access-fbh87\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.954146 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8445b411-7903-43fc-9635-1a16478893cc-sys\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.954146 
ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8445b411-7903-43fc-9635-1a16478893cc-root\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.954146 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.953861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8445b411-7903-43fc-9635-1a16478893cc-sys\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.954269 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.954164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8445b411-7903-43fc-9635-1a16478893cc-metrics-client-ca\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.954269 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.954243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.954341 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.954326 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-textfile\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 
19:54:54.956023 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.956005 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.956122 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.956094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8445b411-7903-43fc-9635-1a16478893cc-node-exporter-tls\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:54.961005 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:54.960984 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbh87\" (UniqueName: \"kubernetes.io/projected/8445b411-7903-43fc-9635-1a16478893cc-kube-api-access-fbh87\") pod \"node-exporter-67f5h\" (UID: \"8445b411-7903-43fc-9635-1a16478893cc\") " pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:55.073000 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:55.072903 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-67f5h" Apr 16 19:54:55.082869 ip-10-0-137-239 kubenswrapper[2574]: W0416 19:54:55.082834 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8445b411_7903_43fc_9635_1a16478893cc.slice/crio-5061b5440ff18e5fde6b03a963c65975688d39b1d7fbc2c27f0d6c4255018a71 WatchSource:0}: Error finding container 5061b5440ff18e5fde6b03a963c65975688d39b1d7fbc2c27f0d6c4255018a71: Status 404 returned error can't find the container with id 5061b5440ff18e5fde6b03a963c65975688d39b1d7fbc2c27f0d6c4255018a71 Apr 16 19:54:55.186243 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:55.186207 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-67f5h" event={"ID":"8445b411-7903-43fc-9635-1a16478893cc","Type":"ContainerStarted","Data":"5061b5440ff18e5fde6b03a963c65975688d39b1d7fbc2c27f0d6c4255018a71"} Apr 16 19:54:57.193444 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:57.193402 2574 generic.go:358] "Generic (PLEG): container finished" podID="8445b411-7903-43fc-9635-1a16478893cc" containerID="038ae11578aa5a6f19ef0509098318e3689272b6e8f90fbd2c22ff012a9ab61a" exitCode=0 Apr 16 19:54:57.193845 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:57.193466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-67f5h" event={"ID":"8445b411-7903-43fc-9635-1a16478893cc","Type":"ContainerDied","Data":"038ae11578aa5a6f19ef0509098318e3689272b6e8f90fbd2c22ff012a9ab61a"} Apr 16 19:54:58.197932 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:58.197888 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-67f5h" event={"ID":"8445b411-7903-43fc-9635-1a16478893cc","Type":"ContainerStarted","Data":"56396897d8bf92edead9b2453d2d897cdb07bac8e2b9d6bef9d34e51627fa332"} Apr 16 19:54:58.197932 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:58.197930 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-67f5h" event={"ID":"8445b411-7903-43fc-9635-1a16478893cc","Type":"ContainerStarted","Data":"26bca2ed1282624080e4dfc1be845eeb890c419529311b9f35de92bc3ae1a817"} Apr 16 19:54:58.221509 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:58.221449 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-67f5h" podStartSLOduration=2.948712667 podStartE2EDuration="4.221434238s" podCreationTimestamp="2026-04-16 19:54:54 +0000 UTC" firstStartedPulling="2026-04-16 19:54:55.084952118 +0000 UTC m=+84.889138981" lastFinishedPulling="2026-04-16 19:54:56.357673691 +0000 UTC m=+86.161860552" observedRunningTime="2026-04-16 19:54:58.219606357 +0000 UTC m=+88.023793227" watchObservedRunningTime="2026-04-16 19:54:58.221434238 +0000 UTC m=+88.025621135" Apr 16 19:54:59.130042 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:54:59.130014 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:55:07.213936 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:07.213903 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-747fd5949f-wpqct"] Apr 16 19:55:08.126537 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:08.126505 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-56m94" Apr 16 19:55:23.270528 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:23.270491 2574 generic.go:358] "Generic (PLEG): container finished" podID="c5bd5e90-47ac-41b5-bc0c-07feb9989bab" containerID="ec509a703d2040db31bbabce43d366c8d8d1e2ab55d0ea512492d4fc8cb48be4" exitCode=0 Apr 16 19:55:23.270952 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:23.270567 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-t672v" 
event={"ID":"c5bd5e90-47ac-41b5-bc0c-07feb9989bab","Type":"ContainerDied","Data":"ec509a703d2040db31bbabce43d366c8d8d1e2ab55d0ea512492d4fc8cb48be4"} Apr 16 19:55:23.270952 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:23.270914 2574 scope.go:117] "RemoveContainer" containerID="ec509a703d2040db31bbabce43d366c8d8d1e2ab55d0ea512492d4fc8cb48be4" Apr 16 19:55:24.279996 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:24.279961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-t672v" event={"ID":"c5bd5e90-47ac-41b5-bc0c-07feb9989bab","Type":"ContainerStarted","Data":"8f859011a24704a485f93dc256c396e463eeb905e04807f806cd242e1d58cae7"} Apr 16 19:55:32.238741 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.238679 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" podUID="26e84dd9-5d40-41dd-95a3-95e087c04263" containerName="registry" containerID="cri-o://4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013" gracePeriod=30 Apr 16 19:55:32.484140 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.484118 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:55:32.572958 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.572858 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wppdd\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-kube-api-access-wppdd\") pod \"26e84dd9-5d40-41dd-95a3-95e087c04263\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " Apr 16 19:55:32.572958 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.572934 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-image-registry-private-configuration\") pod \"26e84dd9-5d40-41dd-95a3-95e087c04263\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " Apr 16 19:55:32.573182 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.572961 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/26e84dd9-5d40-41dd-95a3-95e087c04263-ca-trust-extracted\") pod \"26e84dd9-5d40-41dd-95a3-95e087c04263\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " Apr 16 19:55:32.573182 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.572981 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-installation-pull-secrets\") pod \"26e84dd9-5d40-41dd-95a3-95e087c04263\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " Apr 16 19:55:32.573182 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.573022 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-bound-sa-token\") pod \"26e84dd9-5d40-41dd-95a3-95e087c04263\" (UID: 
\"26e84dd9-5d40-41dd-95a3-95e087c04263\") " Apr 16 19:55:32.573182 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.573056 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-certificates\") pod \"26e84dd9-5d40-41dd-95a3-95e087c04263\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " Apr 16 19:55:32.573182 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.573084 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-trusted-ca\") pod \"26e84dd9-5d40-41dd-95a3-95e087c04263\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " Apr 16 19:55:32.573182 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.573116 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") pod \"26e84dd9-5d40-41dd-95a3-95e087c04263\" (UID: \"26e84dd9-5d40-41dd-95a3-95e087c04263\") " Apr 16 19:55:32.573572 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.573539 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "26e84dd9-5d40-41dd-95a3-95e087c04263" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:32.573636 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.573612 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "26e84dd9-5d40-41dd-95a3-95e087c04263" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:32.575483 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.575455 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "26e84dd9-5d40-41dd-95a3-95e087c04263" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:32.575609 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.575476 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "26e84dd9-5d40-41dd-95a3-95e087c04263" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:32.575900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.575879 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "26e84dd9-5d40-41dd-95a3-95e087c04263" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:32.575900 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.575884 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "26e84dd9-5d40-41dd-95a3-95e087c04263" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:32.576093 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.576076 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-kube-api-access-wppdd" (OuterVolumeSpecName: "kube-api-access-wppdd") pod "26e84dd9-5d40-41dd-95a3-95e087c04263" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263"). InnerVolumeSpecName "kube-api-access-wppdd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:32.581974 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.581947 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e84dd9-5d40-41dd-95a3-95e087c04263-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "26e84dd9-5d40-41dd-95a3-95e087c04263" (UID: "26e84dd9-5d40-41dd-95a3-95e087c04263"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:55:32.673932 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.673896 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wppdd\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-kube-api-access-wppdd\") on node \"ip-10-0-137-239.ec2.internal\" DevicePath \"\"" Apr 16 19:55:32.673932 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.673929 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-image-registry-private-configuration\") on node \"ip-10-0-137-239.ec2.internal\" DevicePath \"\"" Apr 16 19:55:32.673932 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.673940 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/26e84dd9-5d40-41dd-95a3-95e087c04263-ca-trust-extracted\") on node \"ip-10-0-137-239.ec2.internal\" DevicePath \"\"" Apr 16 19:55:32.674164 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.673949 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/26e84dd9-5d40-41dd-95a3-95e087c04263-installation-pull-secrets\") on node \"ip-10-0-137-239.ec2.internal\" DevicePath \"\"" Apr 16 19:55:32.674164 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.673959 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-bound-sa-token\") on node \"ip-10-0-137-239.ec2.internal\" DevicePath \"\"" Apr 16 19:55:32.674164 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.673968 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-certificates\") on node 
\"ip-10-0-137-239.ec2.internal\" DevicePath \"\"" Apr 16 19:55:32.674164 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.673980 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e84dd9-5d40-41dd-95a3-95e087c04263-trusted-ca\") on node \"ip-10-0-137-239.ec2.internal\" DevicePath \"\"" Apr 16 19:55:32.674164 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:32.673989 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26e84dd9-5d40-41dd-95a3-95e087c04263-registry-tls\") on node \"ip-10-0-137-239.ec2.internal\" DevicePath \"\"" Apr 16 19:55:33.307066 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:33.307027 2574 generic.go:358] "Generic (PLEG): container finished" podID="26e84dd9-5d40-41dd-95a3-95e087c04263" containerID="4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013" exitCode=0 Apr 16 19:55:33.307066 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:33.307072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" event={"ID":"26e84dd9-5d40-41dd-95a3-95e087c04263","Type":"ContainerDied","Data":"4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013"} Apr 16 19:55:33.307565 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:33.307085 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" Apr 16 19:55:33.307565 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:33.307095 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747fd5949f-wpqct" event={"ID":"26e84dd9-5d40-41dd-95a3-95e087c04263","Type":"ContainerDied","Data":"404052a1d6f976cac5ee0323df44f231ed2900207d862e069084ac6b7781d7a2"} Apr 16 19:55:33.307565 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:33.307118 2574 scope.go:117] "RemoveContainer" containerID="4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013" Apr 16 19:55:33.315152 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:33.315129 2574 scope.go:117] "RemoveContainer" containerID="4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013" Apr 16 19:55:33.315423 ip-10-0-137-239 kubenswrapper[2574]: E0416 19:55:33.315401 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013\": container with ID starting with 4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013 not found: ID does not exist" containerID="4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013" Apr 16 19:55:33.315493 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:33.315436 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013"} err="failed to get container status \"4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013\": rpc error: code = NotFound desc = could not find container \"4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013\": container with ID starting with 4b0bd084f2cf9f39ebede095984fc981d9e0f7e52ba498be2764d82a55996013 not found: ID does not exist" Apr 16 19:55:33.324862 ip-10-0-137-239 kubenswrapper[2574]: I0416 
19:55:33.324839 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-747fd5949f-wpqct"] Apr 16 19:55:33.328461 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:33.328440 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-747fd5949f-wpqct"] Apr 16 19:55:34.793718 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:34.793681 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e84dd9-5d40-41dd-95a3-95e087c04263" path="/var/lib/kubelet/pods/26e84dd9-5d40-41dd-95a3-95e087c04263/volumes" Apr 16 19:55:44.551041 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:44.550983 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" podUID="52f124ce-6f9c-4329-9b94-708c5c6715f6" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 19:55:47.346671 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:47.346639 2574 generic.go:358] "Generic (PLEG): container finished" podID="76fbc99b-bd4e-4ed8-9580-4a1845bf152f" containerID="610b0da5ac61695659f4b784e8292e7ec4f779f921d7b2591ef60c230fc49b5f" exitCode=0 Apr 16 19:55:47.347127 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:47.346716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" event={"ID":"76fbc99b-bd4e-4ed8-9580-4a1845bf152f","Type":"ContainerDied","Data":"610b0da5ac61695659f4b784e8292e7ec4f779f921d7b2591ef60c230fc49b5f"} Apr 16 19:55:47.347127 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:47.347079 2574 scope.go:117] "RemoveContainer" containerID="610b0da5ac61695659f4b784e8292e7ec4f779f921d7b2591ef60c230fc49b5f" Apr 16 19:55:47.348198 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:47.348179 2574 generic.go:358] "Generic (PLEG): container finished" podID="631ca511-9b5c-4215-97a0-cd40a1f88ddd" 
containerID="59b685a8972d7955a5ecb483fdb24fa645e0bff354071fe68ecd13d4e2c0c368" exitCode=0 Apr 16 19:55:47.348293 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:47.348254 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" event={"ID":"631ca511-9b5c-4215-97a0-cd40a1f88ddd","Type":"ContainerDied","Data":"59b685a8972d7955a5ecb483fdb24fa645e0bff354071fe68ecd13d4e2c0c368"} Apr 16 19:55:47.348523 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:47.348501 2574 scope.go:117] "RemoveContainer" containerID="59b685a8972d7955a5ecb483fdb24fa645e0bff354071fe68ecd13d4e2c0c368" Apr 16 19:55:48.352228 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:48.352187 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nkcw5" event={"ID":"631ca511-9b5c-4215-97a0-cd40a1f88ddd","Type":"ContainerStarted","Data":"4045f45270d9d0ca93d0e375a53de1c5bd20010ccc0997c35ec741208d9dff48"} Apr 16 19:55:48.353691 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:48.353665 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p9xvg" event={"ID":"76fbc99b-bd4e-4ed8-9580-4a1845bf152f","Type":"ContainerStarted","Data":"b257057af8348a115fb3ae05ddaa93b706c5a8d3cfda6bd665b5fc7baf3bf126"} Apr 16 19:55:54.551030 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:55:54.550989 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" podUID="52f124ce-6f9c-4329-9b94-708c5c6715f6" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 19:56:04.551396 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:56:04.551354 2574 prober.go:120] "Probe failed" probeType="Liveness" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" podUID="52f124ce-6f9c-4329-9b94-708c5c6715f6" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 19:56:04.551873 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:56:04.551434 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" Apr 16 19:56:04.552038 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:56:04.552018 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"3d2426b77bf61563c78a68afada513c995a181f6105eebd0a6b8d69bf2c904d0"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 19:56:04.552081 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:56:04.552062 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" podUID="52f124ce-6f9c-4329-9b94-708c5c6715f6" containerName="service-proxy" containerID="cri-o://3d2426b77bf61563c78a68afada513c995a181f6105eebd0a6b8d69bf2c904d0" gracePeriod=30 Apr 16 19:56:05.402285 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:56:05.402244 2574 generic.go:358] "Generic (PLEG): container finished" podID="52f124ce-6f9c-4329-9b94-708c5c6715f6" containerID="3d2426b77bf61563c78a68afada513c995a181f6105eebd0a6b8d69bf2c904d0" exitCode=2 Apr 16 19:56:05.402458 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:56:05.402311 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" event={"ID":"52f124ce-6f9c-4329-9b94-708c5c6715f6","Type":"ContainerDied","Data":"3d2426b77bf61563c78a68afada513c995a181f6105eebd0a6b8d69bf2c904d0"} Apr 
16 19:56:05.402458 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:56:05.402353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c79b8b4fd-dz825" event={"ID":"52f124ce-6f9c-4329-9b94-708c5c6715f6","Type":"ContainerStarted","Data":"f1ee94fccb6eea12b77affb004cc65db8a6d8fd7b6d82bed9f5ca3f27eb3bbcd"} Apr 16 19:58:30.703542 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:58:30.703513 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 19:58:30.704048 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:58:30.703813 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 19:58:30.709265 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:58:30.709238 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 19:58:30.709421 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:58:30.709402 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 19:58:30.712854 ip-10-0-137-239 kubenswrapper[2574]: I0416 19:58:30.712833 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:03:30.723548 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:03:30.723519 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:03:30.725454 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:03:30.725432 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:03:30.728958 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:03:30.728940 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:03:30.730571 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:03:30.730552 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:08:30.743771 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:08:30.743670 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:08:30.745746 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:08:30.745724 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:08:30.749003 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:08:30.748982 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:08:30.750902 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:08:30.750884 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:13:30.762998 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:13:30.762967 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:13:30.765439 ip-10-0-137-239 
kubenswrapper[2574]: I0416 20:13:30.765413 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:13:30.768243 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:13:30.768222 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:13:30.770473 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:13:30.770453 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:18:30.782455 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:18:30.782420 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:18:30.785472 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:18:30.785447 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:18:30.787671 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:18:30.787648 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:18:30.790891 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:18:30.790871 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:23:30.804478 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:23:30.804452 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:23:30.808244 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:23:30.808221 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:23:30.809542 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:23:30.809524 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:23:30.812940 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:23:30.812925 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:28:30.822330 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:28:30.822301 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:28:30.827117 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:28:30.827094 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:28:30.827609 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:28:30.827589 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:28:30.832344 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:28:30.832328 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:33:30.845342 ip-10-0-137-239 
kubenswrapper[2574]: I0416 20:33:30.845314 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:33:30.849677 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:33:30.849654 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:33:30.850251 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:33:30.850230 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:33:30.854340 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:33:30.854320 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:38:30.863912 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:38:30.863832 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:38:30.868932 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:38:30.868906 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:38:30.869074 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:38:30.869012 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:38:30.873953 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:38:30.873933 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:43:30.891591 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:43:30.891455 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:43:30.897118 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:43:30.897098 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:43:30.897118 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:43:30.897111 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:43:30.902359 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:43:30.902341 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:48:30.910457 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:48:30.910345 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:48:30.915462 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:48:30.915441 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:48:30.917174 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:48:30.917158 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:48:30.922037 ip-10-0-137-239 
kubenswrapper[2574]: I0416 20:48:30.922020 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:53:30.929244 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:53:30.929216 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:53:30.934401 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:53:30.934365 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:53:30.937156 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:53:30.937137 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:53:30.942150 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:53:30.942135 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log" Apr 16 20:57:44.692581 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:44.692549 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hk2ln_dfa5879b-9279-4460-929b-8800e9ce40bf/global-pull-secret-syncer/0.log" Apr 16 20:57:44.923596 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:44.923561 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qsmh9_679903aa-1484-4696-9629-25872bd6f204/konnectivity-agent/0.log" Apr 16 20:57:45.001016 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:45.000896 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-239.ec2.internal_f370b553c7c2da44374976f1160c1b70/haproxy/0.log" Apr 16 20:57:48.481953 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:48.481915 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zlp6j_d23d4c10-6686-46c3-bcb7-85ee9826dba3/cluster-monitoring-operator/0.log" Apr 16 20:57:48.723131 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:48.723102 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-67f5h_8445b411-7903-43fc-9635-1a16478893cc/node-exporter/0.log" Apr 16 20:57:48.746455 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:48.746382 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-67f5h_8445b411-7903-43fc-9635-1a16478893cc/kube-rbac-proxy/0.log" Apr 16 20:57:48.762104 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:48.762079 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-67f5h_8445b411-7903-43fc-9635-1a16478893cc/init-textfile/0.log" Apr 16 20:57:50.564719 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:50.564685 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-vrm46_57623683-701c-481e-a07f-ba6e226f7785/networking-console-plugin/0.log" Apr 16 20:57:50.989573 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:50.989535 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/1.log" Apr 16 20:57:50.997253 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:50.997230 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctpdp_efaa6ad2-6fd5-4047-90af-f4b40a394f8f/console-operator/2.log" Apr 16 20:57:51.799777 
ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:51.799727 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-kfft6_4f8ec4e6-16f1-43d3-a059-d546a6492815/volume-data-source-validator/0.log" Apr 16 20:57:52.160239 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.160210 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7"] Apr 16 20:57:52.160502 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.160491 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26e84dd9-5d40-41dd-95a3-95e087c04263" containerName="registry" Apr 16 20:57:52.160549 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.160504 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e84dd9-5d40-41dd-95a3-95e087c04263" containerName="registry" Apr 16 20:57:52.160586 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.160565 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="26e84dd9-5d40-41dd-95a3-95e087c04263" containerName="registry" Apr 16 20:57:52.163322 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.163306 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.165256 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.165231 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x4l79\"/\"openshift-service-ca.crt\"" Apr 16 20:57:52.165386 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.165285 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x4l79\"/\"kube-root-ca.crt\"" Apr 16 20:57:52.165386 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.165294 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x4l79\"/\"default-dockercfg-4rfsr\"" Apr 16 20:57:52.172161 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.172131 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7"] Apr 16 20:57:52.260852 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.260811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-lib-modules\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.261056 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.260863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-proc\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.261056 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.260889 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-podres\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.261056 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.260941 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpshm\" (UniqueName: \"kubernetes.io/projected/9b32f140-a854-48eb-8ef9-a318fbeeb119-kube-api-access-lpshm\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.261056 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.260967 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-sys\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.362354 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.362315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-lib-modules\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.362354 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.362356 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-proc\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " 
pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.362569 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.362371 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-podres\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.362569 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.362406 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpshm\" (UniqueName: \"kubernetes.io/projected/9b32f140-a854-48eb-8ef9-a318fbeeb119-kube-api-access-lpshm\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.362569 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.362433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-sys\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.362569 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.362459 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-proc\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.362569 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.362503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-sys\") 
pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.362569 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.362507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-podres\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.362569 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.362507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b32f140-a854-48eb-8ef9-a318fbeeb119-lib-modules\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.369864 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.369844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpshm\" (UniqueName: \"kubernetes.io/projected/9b32f140-a854-48eb-8ef9-a318fbeeb119-kube-api-access-lpshm\") pod \"perf-node-gather-daemonset-z4sz7\" (UID: \"9b32f140-a854-48eb-8ef9-a318fbeeb119\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.474312 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.474224 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.544346 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.544323 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-q7wjg_c466d975-16d2-4ae9-8d08-159d0c6f360e/dns/0.log" Apr 16 20:57:52.565440 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.565417 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-q7wjg_c466d975-16d2-4ae9-8d08-159d0c6f360e/kube-rbac-proxy/0.log" Apr 16 20:57:52.594213 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.594143 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7"] Apr 16 20:57:52.596398 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.596373 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4gptc_67f8454e-2f9f-4bb4-8cc1-afbff30ea2ca/dns-node-resolver/0.log" Apr 16 20:57:52.597315 ip-10-0-137-239 kubenswrapper[2574]: W0416 20:57:52.597290 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9b32f140_a854_48eb_8ef9_a318fbeeb119.slice/crio-fca7fa9bc654bfde7f03380b4b208f49d1d741f602f013e04d8da8fe291d29c2 WatchSource:0}: Error finding container fca7fa9bc654bfde7f03380b4b208f49d1d741f602f013e04d8da8fe291d29c2: Status 404 returned error can't find the container with id fca7fa9bc654bfde7f03380b4b208f49d1d741f602f013e04d8da8fe291d29c2 Apr 16 20:57:52.598701 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.598684 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:57:52.704881 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.704849 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" 
event={"ID":"9b32f140-a854-48eb-8ef9-a318fbeeb119","Type":"ContainerStarted","Data":"498ec655c1e002eaaf2b99aa3ad40024bbe48de0d7245c5e35fa327f04be5fda"} Apr 16 20:57:52.705050 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.704887 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" event={"ID":"9b32f140-a854-48eb-8ef9-a318fbeeb119","Type":"ContainerStarted","Data":"fca7fa9bc654bfde7f03380b4b208f49d1d741f602f013e04d8da8fe291d29c2"} Apr 16 20:57:52.705050 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.704988 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" Apr 16 20:57:52.719686 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:52.719578 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7" podStartSLOduration=0.719562593 podStartE2EDuration="719.562593ms" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:57:52.718821966 +0000 UTC m=+3862.523008837" watchObservedRunningTime="2026-04-16 20:57:52.719562593 +0000 UTC m=+3862.523749464" Apr 16 20:57:53.147369 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:53.147336 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l8zj8_a03880fb-0202-4f2e-9f09-4064525141cd/node-ca/0.log" Apr 16 20:57:53.935971 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:53.935931 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-696c89d9db-bdmt2_4bbbd110-8877-408f-94d0-0ffb4ab8ed60/router/0.log" Apr 16 20:57:54.281192 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:54.281106 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-9fsdq_175b0100-2c40-4ff1-993a-ed325cab1d64/serve-healthcheck-canary/0.log"
Apr 16 20:57:54.675695 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:54.675653 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-t672v_c5bd5e90-47ac-41b5-bc0c-07feb9989bab/insights-operator/0.log"
Apr 16 20:57:54.678101 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:54.678079 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-t672v_c5bd5e90-47ac-41b5-bc0c-07feb9989bab/insights-operator/1.log"
Apr 16 20:57:54.891585 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:54.891555 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fc2mq_f24a8157-b359-4258-8496-a80f0514a050/kube-rbac-proxy/0.log"
Apr 16 20:57:54.910460 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:54.910430 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fc2mq_f24a8157-b359-4258-8496-a80f0514a050/exporter/0.log"
Apr 16 20:57:54.931080 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:54.931000 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fc2mq_f24a8157-b359-4258-8496-a80f0514a050/extractor/0.log"
Apr 16 20:57:58.717496 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:57:58.717464 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-z4sz7"
Apr 16 20:58:01.456422 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:01.456382 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-nkcw5_631ca511-9b5c-4215-97a0-cd40a1f88ddd/kube-storage-version-migrator-operator/1.log"
Apr 16 20:58:01.458012 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:01.457976 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-nkcw5_631ca511-9b5c-4215-97a0-cd40a1f88ddd/kube-storage-version-migrator-operator/0.log"
Apr 16 20:58:02.460531 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:02.460456 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5srtm_cf3e916e-8fa0-480a-b696-be499c883f60/kube-multus-additional-cni-plugins/0.log"
Apr 16 20:58:02.481938 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:02.481911 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5srtm_cf3e916e-8fa0-480a-b696-be499c883f60/egress-router-binary-copy/0.log"
Apr 16 20:58:02.507286 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:02.507258 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5srtm_cf3e916e-8fa0-480a-b696-be499c883f60/cni-plugins/0.log"
Apr 16 20:58:02.528416 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:02.528392 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5srtm_cf3e916e-8fa0-480a-b696-be499c883f60/bond-cni-plugin/0.log"
Apr 16 20:58:02.547278 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:02.547253 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5srtm_cf3e916e-8fa0-480a-b696-be499c883f60/routeoverride-cni/0.log"
Apr 16 20:58:02.568450 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:02.568419 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5srtm_cf3e916e-8fa0-480a-b696-be499c883f60/whereabouts-cni-bincopy/0.log"
Apr 16 20:58:02.587989 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:02.587967 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5srtm_cf3e916e-8fa0-480a-b696-be499c883f60/whereabouts-cni/0.log"
Apr 16 20:58:03.013433 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:03.013338 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x4r79_6dae1add-3800-41a5-8840-7ad27c67bbec/kube-multus/0.log"
Apr 16 20:58:03.087507 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:03.087474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bdzq7_884798a9-cac3-41a4-af20-f3c01d50646e/network-metrics-daemon/0.log"
Apr 16 20:58:03.108422 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:03.108398 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bdzq7_884798a9-cac3-41a4-af20-f3c01d50646e/kube-rbac-proxy/0.log"
Apr 16 20:58:03.883142 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:03.883112 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-controller/0.log"
Apr 16 20:58:03.899962 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:03.899938 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/0.log"
Apr 16 20:58:03.934336 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:03.934304 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovn-acl-logging/1.log"
Apr 16 20:58:03.973421 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:03.973390 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/kube-rbac-proxy-node/0.log"
Apr 16 20:58:04.036305 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:04.036273 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 20:58:04.073020 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:04.072989 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/northd/0.log"
Apr 16 20:58:04.092908 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:04.092885 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/nbdb/0.log"
Apr 16 20:58:04.113788 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:04.113740 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/sbdb/0.log"
Apr 16 20:58:04.280783 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:04.280686 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-726jz_b78a4e77-873d-4057-97dd-587515df3295/ovnkube-controller/0.log"
Apr 16 20:58:05.936050 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:05.936021 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-qqm97_7663f14f-d056-4025-928a-0110e843ff4c/check-endpoints/0.log"
Apr 16 20:58:05.957261 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:05.957236 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-56m94_fe26e572-de88-4d80-bc05-a05fc220448c/network-check-target-container/0.log"
Apr 16 20:58:06.975369 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:06.975333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2xlpf_4a96e2c0-7394-4112-aea4-555bbe913368/iptables-alerter/0.log"
Apr 16 20:58:07.691540 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:07.691513 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-92xs8_4ad5f1fa-6312-4a33-837e-c75d2d7eb5c4/tuned/0.log"
Apr 16 20:58:09.505676 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:09.505635 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-7gthr_417cddde-9bfc-4522-af0d-b947b69c5362/cluster-samples-operator/0.log"
Apr 16 20:58:09.523406 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:09.523380 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-7gthr_417cddde-9bfc-4522-af0d-b947b69c5362/cluster-samples-operator-watch/0.log"
Apr 16 20:58:10.411399 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:10.411367 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-p9xvg_76fbc99b-bd4e-4ed8-9580-4a1845bf152f/service-ca-operator/1.log"
Apr 16 20:58:10.413393 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:10.413361 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-p9xvg_76fbc99b-bd4e-4ed8-9580-4a1845bf152f/service-ca-operator/0.log"
Apr 16 20:58:11.270963 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:11.270932 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-v9sz6_8514ce9f-7048-45ba-a71b-0b89134eb13c/csi-driver/0.log"
Apr 16 20:58:11.289245 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:11.289214 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-v9sz6_8514ce9f-7048-45ba-a71b-0b89134eb13c/csi-node-driver-registrar/0.log"
Apr 16 20:58:11.307106 ip-10-0-137-239 kubenswrapper[2574]: I0416 20:58:11.307083 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-v9sz6_8514ce9f-7048-45ba-a71b-0b89134eb13c/csi-liveness-probe/0.log"