Apr 16 18:00:25.069973 ip-10-0-142-167 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:00:25.069987 ip-10-0-142-167 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:00:25.069997 ip-10-0-142-167 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:00:25.070321 ip-10-0-142-167 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:00:36.405239 ip-10-0-142-167 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:00:36.405253 ip-10-0-142-167 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 506db7f5b14f4541853a051ed08ea00c --
Apr 16 18:02:47.870453 ip-10-0-142-167 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:02:48.348880 ip-10-0-142-167 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:48.348880 ip-10-0-142-167 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:02:48.348880 ip-10-0-142-167 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:48.348880 ip-10-0-142-167 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:02:48.348880 ip-10-0-142-167 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:48.351588 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.351504 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:02:48.354523 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354509 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:48.354523 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354524 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354527 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354531 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354534 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354537 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354540 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354542 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354545 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354550 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354554 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354558 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354561 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354563 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354567 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354569 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354572 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354574 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354577 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354580 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354583 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:48.354583 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354586 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354588 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354591 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354594 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354597 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354600 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354602 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354605 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354607 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354610 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354614 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354617 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354621 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354623 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354626 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354629 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354632 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354634 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354637 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:48.355062 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354640 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354642 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354645 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354647 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354650 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354652 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354655 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354657 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354660 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354662 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354665 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354668 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354670 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354674 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354677 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354680 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354683 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354685 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354689 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354691 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:48.355533 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354694 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354696 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354699 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354702 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354705 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354707 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354711 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354713 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354716 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354719 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354721 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354724 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354727 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354730 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354733 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354735 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354738 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354740 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354743 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:48.356060 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354746 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354748 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354750 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354753 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354756 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354758 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.354761 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355119 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355124 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355127 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355129 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355132 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355135 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355138 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355140 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355143 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355146 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355149 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355165 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355168 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:48.356538 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355171 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355174 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355176 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355179 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355182 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355186 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355190 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355193 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355196 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355199 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355202 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355205 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355207 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355210 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355213 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355216 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355219 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355221 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355224 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355227 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:48.357015 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355231 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355233 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355236 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355238 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355241 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355244 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355246 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355249 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355251 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355254 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355257 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355259 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355263 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355265 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355268 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355270 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355273 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355275 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355278 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:48.357526 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355281 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355283 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355286 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355288 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355291 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355293 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355296 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355299 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355302 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355304 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355307 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355310 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355313 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355316 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355318 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355321 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355323 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355326 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355328 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355331 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:48.357988 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355333 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355336 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355339 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355341 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355344 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355346 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355350 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355354 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355357 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355359 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355362 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355365 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355367 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.355369 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355442 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355451 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355457 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355465 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355469 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355473 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355478 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:02:48.358498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355483 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355486 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355489 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355493 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355496 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355499 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355502 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355505 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355508 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355511 2578 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355514 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355517 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355521 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355524 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355527 2578 flags.go:64] FLAG: --config-dir=""
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355530 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355534 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355538 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355541 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355544 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355548 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355551 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355554 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355557 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355560 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:02:48.359013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355563 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355568 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355570 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355573 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355576 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355580 2578 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355582 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355587 2578 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355590 2578 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355594 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355597 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355600 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355604 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355607 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355610 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355613 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355616 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:02:48.359701 ip-10-0-142-167
kubenswrapper[2578]: I0416 18:02:48.355619 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355621 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355624 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355627 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355630 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355633 2578 flags.go:64] FLAG: --feature-gates="" Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355637 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355640 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:02:48.359701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355643 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355646 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355650 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355653 2578 flags.go:64] FLAG: --help="false" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355656 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355659 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355663 2578 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355666 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355669 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355672 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355675 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355678 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355681 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355684 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355687 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355690 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355693 2578 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355696 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355699 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355702 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:02:48.360325 ip-10-0-142-167 
kubenswrapper[2578]: I0416 18:02:48.355705 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355707 2578 flags.go:64] FLAG: --lock-file="" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355710 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355713 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:02:48.360325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355716 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355721 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355724 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355727 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355730 2578 flags.go:64] FLAG: --logging-format="text" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355733 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355736 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355739 2578 flags.go:64] FLAG: --manifest-url="" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355742 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355746 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355749 2578 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355753 2578 flags.go:64] FLAG: --max-pods="110" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355757 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355759 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355763 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355766 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355769 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355772 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355775 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355781 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355785 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355788 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355791 2578 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:02:48.360895 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355794 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:02:48.361467 ip-10-0-142-167 
kubenswrapper[2578]: I0416 18:02:48.355800 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355803 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355806 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355809 2578 flags.go:64] FLAG: --port="10250" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355812 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355815 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05c159a495f991e5b" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355818 2578 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355822 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355825 2578 flags.go:64] FLAG: --register-node="true" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355828 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355830 2578 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355834 2578 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355837 2578 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355840 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355843 2578 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:02:48.355846 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355849 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355852 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355855 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355864 2578 flags.go:64] FLAG: --runonce="false" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355867 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355870 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355874 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355877 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355879 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:02:48.361467 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355883 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355886 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355889 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355892 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355897 2578 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355900 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355902 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355906 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355909 2578 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355912 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355917 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355920 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355923 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355927 2578 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355932 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355935 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355938 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355941 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355944 2578 flags.go:64] FLAG: --v="2" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:02:48.355952 2578 flags.go:64] FLAG: --version="false" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355956 2578 flags.go:64] FLAG: --vmodule="" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355960 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.355963 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356048 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:02:48.362106 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356051 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356054 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356057 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356060 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356062 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356065 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356068 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356072 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356076 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356079 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356082 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356085 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356088 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356091 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356093 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356096 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356099 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356101 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356104 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:48.362723 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356106 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356109 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356112 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356116 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356119 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356121 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356124 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356126 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356129 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356132 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356134 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356137 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356139 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356142 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356144 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356147 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356149 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356165 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356168 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356171 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:48.363226 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356174 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356176 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356179 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356182 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356184 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356188 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356191 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356193 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356196 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356199 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356201 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356204 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356206 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356209 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356212 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356216 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356218 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356221 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356223 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356226 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:48.363753 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356229 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356232 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356235 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356237 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356240 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356243 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356245 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356248 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356253 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356256 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356261 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356264 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356267 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356270 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356273 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356276 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356278 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356283 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356286 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356289 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:48.364262 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356291 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356294 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356296 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356299 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356302 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.356304 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.357115 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.364137 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.364150 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364207 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364212 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364215 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364219 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364221 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364224 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364227 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:48.364746 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364230 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364232 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364235 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364238 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364240 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364243 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364247 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364251 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364255 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364258 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364260 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364264 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364266 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364269 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364274 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364278 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364281 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364285 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364288 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:48.365197 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364291 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364294 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364297 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364300 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364303 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364306 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364308 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364311 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364313 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364316 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364318 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364321 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364323 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364326 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364328 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364331 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364333 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364336 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364338 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364341 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:48.365655 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364343 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364346 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364349 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364351 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364354 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364356 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364359 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364362 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364364 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364367 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364369 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364371 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364387 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364390 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364393 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364396 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364399 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364401 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364404 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:48.366233 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364406 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364409 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364411 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364414 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364417 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364419 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364422 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364424 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364427 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364429 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364432 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364434 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364437 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364439 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364442 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364445 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364447 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364449 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364452 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364454 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:48.366699 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364457 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.364462 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364554 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364559 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364562 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364565 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364568 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364571 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364574 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364577 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364580 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364582 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364585 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364588 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364590 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364593 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:48.367377 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364596 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364598 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364601 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364603 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364606 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364608 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364611 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364613 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364616 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364618 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364621 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364624 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364626 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364629 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364631 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364634 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364637 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364639 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364641 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364644 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:48.367963 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364646 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364649 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364651 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364654 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364656 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364659 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364663 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364667 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364670 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364672 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364675 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364678 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364680 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364683 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364685 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364688 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364691 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364694 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:48.368472 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364696 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364699 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364703 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364706 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364709 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364713 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364716 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364719 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364721 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364724 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364727 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364730 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364732 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364735 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364738 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364740 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364743 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364745 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364748 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364751 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:48.368930 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364753 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364756 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364758 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364761 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364764 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364766 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364768 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364771 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364774 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364776 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364779 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364781 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364783 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:48.364786 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.364790 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:48.369405 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.365466 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:02:48.372187 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.372053 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:02:48.373537 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.373526 2578 server.go:1019] "Starting client certificate rotation"
Apr 16 18:02:48.373637 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.373621 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:02:48.373681 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.373662 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:02:48.399523 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.399505 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:02:48.402597 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.402575 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:02:48.414832 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.414816 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:02:48.424009 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.423991 2578 log.go:25] "Validated CRI v1 image API"
Apr 16 18:02:48.425185 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.425150 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:02:48.429312 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.429292 2578 fs.go:135] Filesystem UUIDs: map[160d8c2f-dc8f-49b3-a48e-000c79615db2:/dev/nvme0n1p3 1ca18084-ab1f-422d-9db1-fdea7faaa8db:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 18:02:48.429385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.429310 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:02:48.431187 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.431148 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:02:48.435481 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.435375 2578 manager.go:217] Machine: {Timestamp:2026-04-16 18:02:48.433174923 +0000 UTC m=+0.437342921 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097246 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25de555a4e907179e8b96b3efaf74d SystemUUID:ec25de55-5a4e-9071-79e8-b96b3efaf74d BootID:506db7f5-b14f-4541-853a-051ed08ea00c Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:27:54:41:73:35 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:27:54:41:73:35 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ce:76:a4:60:ac:ed Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:02:48.436179 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.436169 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:02:48.436259 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.436247 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 18:02:48.437482 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.437452 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 18:02:48.437908 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.437482 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-167.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:02:48.438011 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.437920 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:02:48.438187 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.437980 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:02:48.438257 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.438209 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:02:48.439185 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.439172 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:02:48.440656 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.440644 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:02:48.440782 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.440771 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:02:48.443225 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.443214 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:02:48.443280 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.443239 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:02:48.443280 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.443258 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:02:48.443280 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.443271 2578 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:02:48.443423 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.443284 2578 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 18:02:48.444689 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.444608 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:02:48.444689 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.444636 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:02:48.447668 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.447654 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:02:48.449419 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.449405 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:02:48.450323 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450312 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450328 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450334 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450340 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450346 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450355 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450364 2578 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450370 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450378 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450390 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:02:48.450411 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450398 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:02:48.450702 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.450694 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:02:48.451555 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.451545 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:02:48.451555 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.451554 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:02:48.453970 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.453952 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-167.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:02:48.454759 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.454738 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-167.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:02:48.454803 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.454736 2578 
reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:02:48.454946 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.454935 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:02:48.454987 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.454969 2578 server.go:1295] "Started kubelet" Apr 16 18:02:48.455094 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.455042 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:02:48.455179 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.455078 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:02:48.455179 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.455132 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:02:48.455681 ip-10-0-142-167 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:02:48.457136 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.457123 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:02:48.459013 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.458996 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:02:48.463846 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.463830 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:02:48.463932 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.463864 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:02:48.464117 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.464088 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:02:48.464520 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464500 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:02:48.464520 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464523 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:02:48.464666 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464500 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:02:48.464666 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464606 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:02:48.464666 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464616 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:02:48.464801 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464712 2578 factory.go:153] Registering CRI-O factory Apr 16 18:02:48.464801 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.464720 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:48.464801 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464726 2578 factory.go:223] Registration of the crio container factory successfully Apr 16 18:02:48.464801 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464796 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:02:48.464989 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464806 2578 factory.go:55] Registering systemd factory Apr 16 18:02:48.464989 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464813 2578 factory.go:223] Registration of the systemd container factory successfully Apr 16 
18:02:48.464989 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464832 2578 factory.go:103] Registering Raw factory Apr 16 18:02:48.464989 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.464843 2578 manager.go:1196] Started watching for new ooms in manager Apr 16 18:02:48.465289 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.465272 2578 manager.go:319] Starting recovery of all containers Apr 16 18:02:48.470665 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.470631 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-167.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:02:48.470941 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.470907 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:02:48.474724 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.470688 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-167.ec2.internal.18a6e8582d0ebfa2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-167.ec2.internal,UID:ip-10-0-142-167.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-167.ec2.internal,},FirstTimestamp:2026-04-16 18:02:48.454946722 +0000 UTC m=+0.459114720,LastTimestamp:2026-04-16 18:02:48.454946722 
+0000 UTC m=+0.459114720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-167.ec2.internal,}" Apr 16 18:02:48.476643 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.476626 2578 manager.go:324] Recovery completed Apr 16 18:02:48.480378 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.480337 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4lnv2" Apr 16 18:02:48.481121 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.481110 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:48.483844 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.483827 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:48.483899 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.483854 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:48.483899 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.483864 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:48.484367 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.484354 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:02:48.484367 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.484366 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:02:48.484496 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.484383 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:02:48.485756 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.485696 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-167.ec2.internal.18a6e8582ec7a9d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-167.ec2.internal,UID:ip-10-0-142-167.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-167.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-167.ec2.internal,},FirstTimestamp:2026-04-16 18:02:48.483842514 +0000 UTC m=+0.488010511,LastTimestamp:2026-04-16 18:02:48.483842514 +0000 UTC m=+0.488010511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-167.ec2.internal,}" Apr 16 18:02:48.486766 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.486752 2578 policy_none.go:49] "None policy: Start" Apr 16 18:02:48.486828 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.486770 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:02:48.486828 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.486780 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:02:48.491391 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.491373 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4lnv2" Apr 16 18:02:48.492507 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.492450 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-167.ec2.internal.18a6e8582ec7e896 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-167.ec2.internal,UID:ip-10-0-142-167.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-142-167.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-142-167.ec2.internal,},FirstTimestamp:2026-04-16 18:02:48.483858582 +0000 UTC m=+0.488026580,LastTimestamp:2026-04-16 18:02:48.483858582 +0000 UTC m=+0.488026580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-167.ec2.internal,}" Apr 16 18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.525317 2578 manager.go:341] "Starting Device Plugin manager" Apr 16 18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.525342 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.525350 2578 server.go:85] "Starting device plugin registration server" Apr 16 18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.525578 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.525591 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.525753 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.525817 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.525828 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 
18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.526492 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:02:48.534477 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.526529 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:48.555656 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.555635 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:02:48.556805 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.556786 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:02:48.556889 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.556811 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:02:48.556889 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.556869 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:02:48.556889 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.556877 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:02:48.557038 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.556907 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:02:48.561360 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.561344 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:48.625824 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.625786 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:48.626634 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.626620 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:48.626702 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.626644 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:48.626702 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.626654 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:48.626702 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.626673 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.635394 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.635380 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.635438 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.635398 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-167.ec2.internal\": node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 
18:02:48.657092 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.657064 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal"] Apr 16 18:02:48.657188 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.657141 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:48.658685 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.658671 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:48.658766 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.658702 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:48.658766 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.658715 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:48.659940 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.659927 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:48.660084 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.660071 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.660175 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.660107 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:48.660567 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.660548 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:48.660647 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.660552 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:48.660647 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.660604 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:48.660647 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.660615 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:48.660647 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.660578 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:48.660768 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.660663 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:48.662324 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.662309 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.662386 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.662332 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:48.662900 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.662886 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:48.662966 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.662922 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:48.662966 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.662938 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:48.665973 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.665946 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e388a8259ec158bfe054771dea9bf3e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal\" (UID: \"4e388a8259ec158bfe054771dea9bf3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.666019 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.665988 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e388a8259ec158bfe054771dea9bf3e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal\" (UID: \"4e388a8259ec158bfe054771dea9bf3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.666062 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.666044 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f3d3ab7bc151663feafe988605589632-config\") pod \"kube-apiserver-proxy-ip-10-0-142-167.ec2.internal\" (UID: \"f3d3ab7bc151663feafe988605589632\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.697624 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.697604 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-167.ec2.internal\" not found" node="ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.701876 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.701863 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-167.ec2.internal\" not found" node="ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.711147 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.711133 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:48.766894 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.766878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e388a8259ec158bfe054771dea9bf3e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal\" (UID: \"4e388a8259ec158bfe054771dea9bf3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.766955 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.766900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e388a8259ec158bfe054771dea9bf3e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal\" (UID: \"4e388a8259ec158bfe054771dea9bf3e\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.766955 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.766915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f3d3ab7bc151663feafe988605589632-config\") pod \"kube-apiserver-proxy-ip-10-0-142-167.ec2.internal\" (UID: \"f3d3ab7bc151663feafe988605589632\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.766955 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.766939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f3d3ab7bc151663feafe988605589632-config\") pod \"kube-apiserver-proxy-ip-10-0-142-167.ec2.internal\" (UID: \"f3d3ab7bc151663feafe988605589632\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.767053 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.766965 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e388a8259ec158bfe054771dea9bf3e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal\" (UID: \"4e388a8259ec158bfe054771dea9bf3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.767053 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:48.766984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e388a8259ec158bfe054771dea9bf3e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal\" (UID: \"4e388a8259ec158bfe054771dea9bf3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" Apr 16 18:02:48.811224 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.811203 2578 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:48.912326 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:48.912274 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.000922 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.000903 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" Apr 16 18:02:49.004898 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.004884 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal" Apr 16 18:02:49.013234 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.013219 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.113755 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.113730 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.214227 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.214166 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.314783 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.314762 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.330598 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.330579 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:49.373217 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.373195 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new 
credentials" Apr 16 18:02:49.373810 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.373304 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:02:49.373810 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.373353 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:02:49.415489 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.415465 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.464082 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.464063 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:02:49.476558 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.476504 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:02:49.493745 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.493713 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:57:48 +0000 UTC" deadline="2028-01-26 07:48:56.355347362 +0000 UTC" Apr 16 18:02:49.493745 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.493740 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15589h46m6.861612062s" Apr 16 18:02:49.495979 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:02:49.495963 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jw5w8" Apr 16 18:02:49.508343 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.508321 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jw5w8" Apr 16 18:02:49.515791 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.515771 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.533510 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:49.533479 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3d3ab7bc151663feafe988605589632.slice/crio-326bebe4271cc60af75fac4fa9fba80078a7013de67d2ac332a6b4b1d3be1856 WatchSource:0}: Error finding container 326bebe4271cc60af75fac4fa9fba80078a7013de67d2ac332a6b4b1d3be1856: Status 404 returned error can't find the container with id 326bebe4271cc60af75fac4fa9fba80078a7013de67d2ac332a6b4b1d3be1856 Apr 16 18:02:49.533915 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:49.533891 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e388a8259ec158bfe054771dea9bf3e.slice/crio-ca2c40de3b9ade9d66b40362ded0d74a1f22ba763114b2a35120dbf6137dd51f WatchSource:0}: Error finding container ca2c40de3b9ade9d66b40362ded0d74a1f22ba763114b2a35120dbf6137dd51f: Status 404 returned error can't find the container with id ca2c40de3b9ade9d66b40362ded0d74a1f22ba763114b2a35120dbf6137dd51f Apr 16 18:02:49.538711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.538524 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:02:49.560122 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.560088 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" event={"ID":"4e388a8259ec158bfe054771dea9bf3e","Type":"ContainerStarted","Data":"ca2c40de3b9ade9d66b40362ded0d74a1f22ba763114b2a35120dbf6137dd51f"} Apr 16 18:02:49.561037 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.561016 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal" event={"ID":"f3d3ab7bc151663feafe988605589632","Type":"ContainerStarted","Data":"326bebe4271cc60af75fac4fa9fba80078a7013de67d2ac332a6b4b1d3be1856"} Apr 16 18:02:49.616480 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.616460 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.716898 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.716880 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.817371 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.817317 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.891512 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.891493 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:49.918065 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:49.918043 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-167.ec2.internal\" not found" Apr 16 18:02:49.984351 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:49.984330 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:50.064987 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.064792 2578 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" Apr 16 18:02:50.083416 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.083362 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:50.084900 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.084858 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal" Apr 16 18:02:50.103402 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.103383 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:50.444030 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.443973 2578 apiserver.go:52] "Watching apiserver" Apr 16 18:02:50.451118 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.451091 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:02:50.451437 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.451415 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gsvsh","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal","openshift-multus/multus-7lp9m","openshift-multus/network-metrics-daemon-ndzmp","openshift-network-diagnostics/network-check-target-6tmgb","kube-system/konnectivity-agent-zlrpl","kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal","openshift-cluster-node-tuning-operator/tuned-kt6vt","openshift-image-registry/node-ca-dmn8s","openshift-multus/multus-additional-cni-plugins-n6d8n","openshift-network-operator/iptables-alerter-2bz8t"] Apr 16 18:02:50.453684 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.453662 2578 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.454871 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.454852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.456123 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.456091 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.456390 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.456361 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:02:50.456491 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.456458 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:02:50.456601 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.456581 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kfgnr\"" Apr 16 18:02:50.456898 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.456861 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:02:50.457000 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.456915 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:02:50.457000 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.456946 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:02:50.457106 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.456999 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:02:50.457328 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.457309 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:50.457512 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:50.457384 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:02:50.457602 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.457585 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:02:50.457694 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.457617 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nf4sf\"" Apr 16 18:02:50.457759 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.457740 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:02:50.457903 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.457883 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:02:50.458756 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.458498 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:02:50.458756 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.458695 2578 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nh8xf\"" Apr 16 18:02:50.458756 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.458707 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:02:50.458945 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.458930 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:02:50.459342 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.459316 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:02:50.460186 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.460168 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:50.460274 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.460250 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zlrpl" Apr 16 18:02:50.460274 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:50.460252 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:02:50.461819 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.461802 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dmn8s" Apr 16 18:02:50.462719 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.462648 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:02:50.462719 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.462705 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:02:50.462852 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.462742 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-pmbd2\"" Apr 16 18:02:50.463077 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.463058 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.465238 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.464479 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:02:50.465238 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.464513 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:02:50.465238 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.464835 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:02:50.465238 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.464915 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-h2kvz\"" Apr 16 18:02:50.465474 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.465400 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2bz8t" Apr 16 18:02:50.467120 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.466247 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mkd72\"" Apr 16 18:02:50.467120 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.466264 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:02:50.467120 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.466288 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:02:50.468954 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.468571 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.468954 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.468799 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:50.468954 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.468828 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kslbq\"" Apr 16 18:02:50.468954 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.468829 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:02:50.469217 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.469130 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:50.471101 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.471079 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-269wx\"" Apr 16 18:02:50.471327 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.471311 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:50.471424 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.471408 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:50.475883 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.475862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.475997 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.475979 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-cni-bin\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.476103 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476090 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-cnibin\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.476223 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476209 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-hostroot\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.476329 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476310 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.476389 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt8kb\" (UniqueName: \"kubernetes.io/projected/b3ae8fd0-5875-4adc-8994-9c74852c6397-kube-api-access-vt8kb\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.476445 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476396 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4bs5\" (UniqueName: \"kubernetes.io/projected/60136db5-eb71-48af-b059-62d18f47a211-kube-api-access-g4bs5\") pod \"node-ca-dmn8s\" (UID: \"60136db5-eb71-48af-b059-62d18f47a211\") " pod="openshift-image-registry/node-ca-dmn8s" Apr 16 18:02:50.476445 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476430 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-lib-modules\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.476540 ip-10-0-142-167 
kubenswrapper[2578]: I0416 18:02:50.476458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/47f30843-f6cf-4b8f-97fc-e1c1b5c83d63-konnectivity-ca\") pod \"konnectivity-agent-zlrpl\" (UID: \"47f30843-f6cf-4b8f-97fc-e1c1b5c83d63\") " pod="kube-system/konnectivity-agent-zlrpl" Apr 16 18:02:50.476540 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0b63c55-85f3-4126-9cbf-dac101325a0b-host-slash\") pod \"iptables-alerter-2bz8t\" (UID: \"a0b63c55-85f3-4126-9cbf-dac101325a0b\") " pod="openshift-network-operator/iptables-alerter-2bz8t" Apr 16 18:02:50.476540 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476496 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b3ae8fd0-5875-4adc-8994-9c74852c6397-env-overrides\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.476540 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476517 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ee28cfd-b76c-488a-8374-405ee3a9a635-cni-binary-copy\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.476712 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk72v\" (UniqueName: \"kubernetes.io/projected/3ee28cfd-b76c-488a-8374-405ee3a9a635-kube-api-access-lk72v\") pod 
\"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.476712 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-modprobe-d\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.476712 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-etc-openvswitch\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.476712 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-os-release\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.476712 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476660 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-sysctl-conf\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.476712 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476707 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-slash\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.476963 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476744 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-etc-kubernetes\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.476963 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/60136db5-eb71-48af-b059-62d18f47a211-serviceca\") pod \"node-ca-dmn8s\" (UID: \"60136db5-eb71-48af-b059-62d18f47a211\") " pod="openshift-image-registry/node-ca-dmn8s" Apr 16 18:02:50.476963 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-run\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.476963 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-registration-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.476963 ip-10-0-142-167 
kubenswrapper[2578]: I0416 18:02:50.476870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrzck\" (UniqueName: \"kubernetes.io/projected/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-kube-api-access-jrzck\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:50.476963 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476896 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-tmp\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.476963 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476922 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwqlm\" (UniqueName: \"kubernetes.io/projected/8c94f1e2-89d6-435e-884f-a0a41da4b42f-kube-api-access-cwqlm\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.476963 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476947 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c8eed63-c467-428b-aa99-b72e120b58e9-cni-binary-copy\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476970 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-socket-dir-parent\") pod 
\"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.476993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-run-k8s-cni-cncf-io\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-var-lib-kubelet\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477057 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r58b\" (UniqueName: \"kubernetes.io/projected/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-kube-api-access-2r58b\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/47f30843-f6cf-4b8f-97fc-e1c1b5c83d63-agent-certs\") pod \"konnectivity-agent-zlrpl\" (UID: \"47f30843-f6cf-4b8f-97fc-e1c1b5c83d63\") " pod="kube-system/konnectivity-agent-zlrpl" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477181 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-sysctl-d\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-kubelet\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477232 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-var-lib-openvswitch\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477274 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-run-openvswitch\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.477317 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-var-lib-cni-multus\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477351 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-system-cni-dir\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477378 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-host\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-socket-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3ee28cfd-b76c-488a-8374-405ee3a9a635-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477457 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-sysconfig\") pod \"tuned-kt6vt\" (UID: 
\"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-systemd-units\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b3ae8fd0-5875-4adc-8994-9c74852c6397-ovnkube-script-lib\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-cni-dir\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477598 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-conf-dir\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-run-multus-certs\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-var-lib-kubelet\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.477711 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-run-systemd\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-log-socket\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b3ae8fd0-5875-4adc-8994-9c74852c6397-ovnkube-config\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477779 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-device-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkkzh\" (UniqueName: \"kubernetes.io/projected/a0b63c55-85f3-4126-9cbf-dac101325a0b-kube-api-access-kkkzh\") pod \"iptables-alerter-2bz8t\" (UID: \"a0b63c55-85f3-4126-9cbf-dac101325a0b\") " pod="openshift-network-operator/iptables-alerter-2bz8t" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-run-netns\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-cni-netd\") pod \"ovnkube-node-gsvsh\" (UID: 
\"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477927 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b3ae8fd0-5875-4adc-8994-9c74852c6397-ovn-node-metrics-cert\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.477984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-system-cni-dir\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478021 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcgg2\" (UniqueName: \"kubernetes.io/projected/8c8eed63-c467-428b-aa99-b72e120b58e9-kube-api-access-vcgg2\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478047 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a0b63c55-85f3-4126-9cbf-dac101325a0b-iptables-alerter-script\") pod \"iptables-alerter-2bz8t\" (UID: \"a0b63c55-85f3-4126-9cbf-dac101325a0b\") " pod="openshift-network-operator/iptables-alerter-2bz8t" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478075 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-run-ovn\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-os-release\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-run-netns\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-systemd\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.478385 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-sys\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478248 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-kubernetes\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-tuned\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-etc-selinux\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-daemon-config\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478340 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-cnibin\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.479022 ip-10-0-142-167 
kubenswrapper[2578]: I0416 18:02:50.478368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-node-log\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-run-ovn-kubernetes\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60136db5-eb71-48af-b059-62d18f47a211-host\") pod \"node-ca-dmn8s\" (UID: \"60136db5-eb71-48af-b059-62d18f47a211\") " pod="openshift-image-registry/node-ca-dmn8s" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-var-lib-cni-bin\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478530 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvb2\" (UniqueName: \"kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2\") pod \"network-check-target-6tmgb\" (UID: 
\"a336e08f-92e1-4f5f-99d6-9f8231b01727\") " pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3ee28cfd-b76c-488a-8374-405ee3a9a635-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478578 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-sys-fs\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.479022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.478598 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:50.509109 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.509074 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:49 +0000 UTC" deadline="2027-10-30 11:17:29.434730126 +0000 UTC" Apr 16 18:02:50.509109 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.509108 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13481h14m38.925624464s" Apr 16 18:02:50.565065 
ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.565046 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:02:50.578996 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.578973 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-node-log\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.579116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579001 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-run-ovn-kubernetes\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.579116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60136db5-eb71-48af-b059-62d18f47a211-host\") pod \"node-ca-dmn8s\" (UID: \"60136db5-eb71-48af-b059-62d18f47a211\") " pod="openshift-image-registry/node-ca-dmn8s" Apr 16 18:02:50.579116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-var-lib-cni-bin\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.579116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-var-lib-cni-bin\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.579116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-run-ovn-kubernetes\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.579116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60136db5-eb71-48af-b059-62d18f47a211-host\") pod \"node-ca-dmn8s\" (UID: \"60136db5-eb71-48af-b059-62d18f47a211\") " pod="openshift-image-registry/node-ca-dmn8s" Apr 16 18:02:50.579116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-node-log\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvb2\" (UniqueName: \"kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2\") pod \"network-check-target-6tmgb\" (UID: \"a336e08f-92e1-4f5f-99d6-9f8231b01727\") " pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" 
(UniqueName: \"kubernetes.io/configmap/3ee28cfd-b76c-488a-8374-405ee3a9a635-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-sys-fs\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579263 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579276 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-sys-fs\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:02:50.579287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-cni-bin\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-cnibin\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:50.579337 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579368 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-hostroot\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-hostroot\") pod \"multus-7lp9m\" 
(UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579409 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-cni-bin\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:50.579421 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs podName:2f073ea3-db3b-4eaa-9a74-db58c9d97b21 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:51.079385326 +0000 UTC m=+3.083553314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs") pod "network-metrics-daemon-ndzmp" (UID: "2f073ea3-db3b-4eaa-9a74-db58c9d97b21") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579421 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-cnibin\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 
18:02:50.579498 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579483 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt8kb\" (UniqueName: \"kubernetes.io/projected/b3ae8fd0-5875-4adc-8994-9c74852c6397-kube-api-access-vt8kb\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4bs5\" (UniqueName: \"kubernetes.io/projected/60136db5-eb71-48af-b059-62d18f47a211-kube-api-access-g4bs5\") pod \"node-ca-dmn8s\" (UID: \"60136db5-eb71-48af-b059-62d18f47a211\") " pod="openshift-image-registry/node-ca-dmn8s"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-lib-modules\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579535 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/47f30843-f6cf-4b8f-97fc-e1c1b5c83d63-konnectivity-ca\") pod \"konnectivity-agent-zlrpl\" (UID: \"47f30843-f6cf-4b8f-97fc-e1c1b5c83d63\") " pod="kube-system/konnectivity-agent-zlrpl"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0b63c55-85f3-4126-9cbf-dac101325a0b-host-slash\") pod \"iptables-alerter-2bz8t\" (UID: \"a0b63c55-85f3-4126-9cbf-dac101325a0b\") " pod="openshift-network-operator/iptables-alerter-2bz8t"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0b63c55-85f3-4126-9cbf-dac101325a0b-host-slash\") pod \"iptables-alerter-2bz8t\" (UID: \"a0b63c55-85f3-4126-9cbf-dac101325a0b\") " pod="openshift-network-operator/iptables-alerter-2bz8t"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.579976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-lib-modules\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b3ae8fd0-5875-4adc-8994-9c74852c6397-env-overrides\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ee28cfd-b76c-488a-8374-405ee3a9a635-cni-binary-copy\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk72v\" (UniqueName: \"kubernetes.io/projected/3ee28cfd-b76c-488a-8374-405ee3a9a635-kube-api-access-lk72v\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.580270 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-modprobe-d\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.580788 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-etc-openvswitch\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.580788 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-os-release\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.580788 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580415 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-sysctl-conf\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.580788 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580445 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-slash\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.580788 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-etc-kubernetes\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.580788 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/60136db5-eb71-48af-b059-62d18f47a211-serviceca\") pod \"node-ca-dmn8s\" (UID: \"60136db5-eb71-48af-b059-62d18f47a211\") " pod="openshift-image-registry/node-ca-dmn8s"
Apr 16 18:02:50.580788 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/47f30843-f6cf-4b8f-97fc-e1c1b5c83d63-konnectivity-ca\") pod \"konnectivity-agent-zlrpl\" (UID: \"47f30843-f6cf-4b8f-97fc-e1c1b5c83d63\") " pod="kube-system/konnectivity-agent-zlrpl"
Apr 16 18:02:50.581132 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b3ae8fd0-5875-4adc-8994-9c74852c6397-env-overrides\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.581132 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-sysctl-conf\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.581132 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580959 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/60136db5-eb71-48af-b059-62d18f47a211-serviceca\") pod \"node-ca-dmn8s\" (UID: \"60136db5-eb71-48af-b059-62d18f47a211\") " pod="openshift-image-registry/node-ca-dmn8s"
Apr 16 18:02:50.581132 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-modprobe-d\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.581132 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581041 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-slash\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.581132 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.580443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3ee28cfd-b76c-488a-8374-405ee3a9a635-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.581132 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581041 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-etc-openvswitch\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.581132 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-etc-kubernetes\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.581553 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-run\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.581553 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581206 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-os-release\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.581553 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-registration-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv"
Apr 16 18:02:50.581553 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-run\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.581553 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrzck\" (UniqueName: \"kubernetes.io/projected/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-kube-api-access-jrzck\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp"
Apr 16 18:02:50.581553 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-tmp\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.581553 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-registration-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv"
Apr 16 18:02:50.581553 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581361 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwqlm\" (UniqueName: \"kubernetes.io/projected/8c94f1e2-89d6-435e-884f-a0a41da4b42f-kube-api-access-cwqlm\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv"
Apr 16 18:02:50.581553 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c8eed63-c467-428b-aa99-b72e120b58e9-cni-binary-copy\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581570 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-socket-dir-parent\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581601 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-run-k8s-cni-cncf-io\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-var-lib-kubelet\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581676 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r58b\" (UniqueName: \"kubernetes.io/projected/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-kube-api-access-2r58b\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-run-k8s-cni-cncf-io\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581684 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-socket-dir-parent\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/47f30843-f6cf-4b8f-97fc-e1c1b5c83d63-agent-certs\") pod \"konnectivity-agent-zlrpl\" (UID: \"47f30843-f6cf-4b8f-97fc-e1c1b5c83d63\") " pod="kube-system/konnectivity-agent-zlrpl"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-sysctl-d\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-kubelet\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581798 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-var-lib-openvswitch\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-run-openvswitch\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-var-lib-cni-multus\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-system-cni-dir\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581942 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-host\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.581971 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.581972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-socket-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3ee28cfd-b76c-488a-8374-405ee3a9a635-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582037 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-sysconfig\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-systemd-units\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b3ae8fd0-5875-4adc-8994-9c74852c6397-ovnkube-script-lib\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-cni-dir\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-conf-dir\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-run-multus-certs\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582279 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-host\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ee28cfd-b76c-488a-8374-405ee3a9a635-cni-binary-copy\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-sysctl-d\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-var-lib-openvswitch\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c8eed63-c467-428b-aa99-b72e120b58e9-cni-binary-copy\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582472 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-var-lib-cni-multus\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-system-cni-dir\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-var-lib-kubelet\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582567 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-sysconfig\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-cni-dir\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.582868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582562 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-kubelet\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582658 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-systemd-units\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-var-lib-kubelet\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-var-lib-kubelet\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-run-systemd\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-log-socket\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b3ae8fd0-5875-4adc-8994-9c74852c6397-ovnkube-config\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-device-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkkzh\" (UniqueName: \"kubernetes.io/projected/a0b63c55-85f3-4126-9cbf-dac101325a0b-kube-api-access-kkkzh\") pod \"iptables-alerter-2bz8t\" (UID: \"a0b63c55-85f3-4126-9cbf-dac101325a0b\") " pod="openshift-network-operator/iptables-alerter-2bz8t"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-run-netns\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-cni-netd\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b3ae8fd0-5875-4adc-8994-9c74852c6397-ovn-node-metrics-cert\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583030 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-system-cni-dir\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcgg2\" (UniqueName: \"kubernetes.io/projected/8c8eed63-c467-428b-aa99-b72e120b58e9-kube-api-access-vcgg2\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a0b63c55-85f3-4126-9cbf-dac101325a0b-iptables-alerter-script\") pod \"iptables-alerter-2bz8t\" (UID: \"a0b63c55-85f3-4126-9cbf-dac101325a0b\") " pod="openshift-network-operator/iptables-alerter-2bz8t"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-run-ovn\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.583730 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b3ae8fd0-5875-4adc-8994-9c74852c6397-ovnkube-script-lib\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-log-socket\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-conf-dir\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583322 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-run-multus-certs\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3ee28cfd-b76c-488a-8374-405ee3a9a635-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583146 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-os-release\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-run-netns\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-run-systemd\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-systemd\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583445 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-sys\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-kubernetes\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-socket-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583504 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-tuned\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583530 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-etc-selinux\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-cni-netd\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-daemon-config\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583616 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-systemd\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt"
Apr 16 18:02:50.584608 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583641 2578
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b3ae8fd0-5875-4adc-8994-9c74852c6397-ovnkube-config\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583666 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-cnibin\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-sys\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.582851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-run-openvswitch\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-device-dir\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583837 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-host-run-netns\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583891 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-kubernetes\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ee28cfd-b76c-488a-8374-405ee3a9a635-cnibin\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: \"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.583993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-os-release\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.584001 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8c94f1e2-89d6-435e-884f-a0a41da4b42f-etc-selinux\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.584498 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-system-cni-dir\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.584733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8c8eed63-c467-428b-aa99-b72e120b58e9-multus-daemon-config\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.585018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b3ae8fd0-5875-4adc-8994-9c74852c6397-run-ovn\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.585020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c8eed63-c467-428b-aa99-b72e120b58e9-host-run-netns\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.585442 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a0b63c55-85f3-4126-9cbf-dac101325a0b-iptables-alerter-script\") pod \"iptables-alerter-2bz8t\" (UID: \"a0b63c55-85f3-4126-9cbf-dac101325a0b\") " pod="openshift-network-operator/iptables-alerter-2bz8t" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.586602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-tmp\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:50.586785 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:50.586803 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:50.586994 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:50.586817 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ssvb2 for pod openshift-network-diagnostics/network-check-target-6tmgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:50.588098 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:50.586903 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2 podName:a336e08f-92e1-4f5f-99d6-9f8231b01727 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:51.086886261 +0000 UTC m=+3.091054263 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ssvb2" (UniqueName: "kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2") pod "network-check-target-6tmgb" (UID: "a336e08f-92e1-4f5f-99d6-9f8231b01727") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:50.588098 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.586899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b3ae8fd0-5875-4adc-8994-9c74852c6397-ovn-node-metrics-cert\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.588098 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.587219 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/47f30843-f6cf-4b8f-97fc-e1c1b5c83d63-agent-certs\") pod \"konnectivity-agent-zlrpl\" (UID: \"47f30843-f6cf-4b8f-97fc-e1c1b5c83d63\") " pod="kube-system/konnectivity-agent-zlrpl" Apr 16 18:02:50.588098 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.587781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-etc-tuned\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.589239 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.589216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4bs5\" (UniqueName: \"kubernetes.io/projected/60136db5-eb71-48af-b059-62d18f47a211-kube-api-access-g4bs5\") pod \"node-ca-dmn8s\" (UID: \"60136db5-eb71-48af-b059-62d18f47a211\") " pod="openshift-image-registry/node-ca-dmn8s" Apr 16 
18:02:50.589913 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.589487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt8kb\" (UniqueName: \"kubernetes.io/projected/b3ae8fd0-5875-4adc-8994-9c74852c6397-kube-api-access-vt8kb\") pod \"ovnkube-node-gsvsh\" (UID: \"b3ae8fd0-5875-4adc-8994-9c74852c6397\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.590744 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.590703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrzck\" (UniqueName: \"kubernetes.io/projected/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-kube-api-access-jrzck\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:50.591092 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.590950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r58b\" (UniqueName: \"kubernetes.io/projected/f3e6d8de-b84b-46b8-afca-2bb1d7a16da2-kube-api-access-2r58b\") pod \"tuned-kt6vt\" (UID: \"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2\") " pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:50.591249 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.591231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwqlm\" (UniqueName: \"kubernetes.io/projected/8c94f1e2-89d6-435e-884f-a0a41da4b42f-kube-api-access-cwqlm\") pod \"aws-ebs-csi-driver-node-kpvvv\" (UID: \"8c94f1e2-89d6-435e-884f-a0a41da4b42f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.591541 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.591520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk72v\" (UniqueName: \"kubernetes.io/projected/3ee28cfd-b76c-488a-8374-405ee3a9a635-kube-api-access-lk72v\") pod \"multus-additional-cni-plugins-n6d8n\" (UID: 
\"3ee28cfd-b76c-488a-8374-405ee3a9a635\") " pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.591909 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.591888 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkkzh\" (UniqueName: \"kubernetes.io/projected/a0b63c55-85f3-4126-9cbf-dac101325a0b-kube-api-access-kkkzh\") pod \"iptables-alerter-2bz8t\" (UID: \"a0b63c55-85f3-4126-9cbf-dac101325a0b\") " pod="openshift-network-operator/iptables-alerter-2bz8t" Apr 16 18:02:50.592455 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.592437 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcgg2\" (UniqueName: \"kubernetes.io/projected/8c8eed63-c467-428b-aa99-b72e120b58e9-kube-api-access-vcgg2\") pod \"multus-7lp9m\" (UID: \"8c8eed63-c467-428b-aa99-b72e120b58e9\") " pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.740002 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.739928 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:50.766429 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.766407 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:02:50.774071 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.774046 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" Apr 16 18:02:50.782690 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.782668 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7lp9m" Apr 16 18:02:50.787390 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.787368 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-zlrpl" Apr 16 18:02:50.794884 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.794862 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dmn8s" Apr 16 18:02:50.800378 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.800359 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" Apr 16 18:02:50.806844 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.806826 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2bz8t" Apr 16 18:02:50.812373 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:50.812357 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" Apr 16 18:02:51.087338 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.087269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvb2\" (UniqueName: \"kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2\") pod \"network-check-target-6tmgb\" (UID: \"a336e08f-92e1-4f5f-99d6-9f8231b01727\") " pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:51.087338 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.087316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:51.087491 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:51.087387 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Apr 16 18:02:51.087491 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:51.087407 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:51.087491 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:51.087418 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ssvb2 for pod openshift-network-diagnostics/network-check-target-6tmgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:51.087491 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:51.087419 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:51.087491 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:51.087471 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2 podName:a336e08f-92e1-4f5f-99d6-9f8231b01727 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:52.087457366 +0000 UTC m=+4.091625351 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ssvb2" (UniqueName: "kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2") pod "network-check-target-6tmgb" (UID: "a336e08f-92e1-4f5f-99d6-9f8231b01727") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:51.087491 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:51.087487 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs podName:2f073ea3-db3b-4eaa-9a74-db58c9d97b21 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:02:52.08748034 +0000 UTC m=+4.091648324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs") pod "network-metrics-daemon-ndzmp" (UID: "2f073ea3-db3b-4eaa-9a74-db58c9d97b21") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:51.120221 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:51.120190 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b63c55_85f3_4126_9cbf_dac101325a0b.slice/crio-a5bc349b92de7b08c0260e680fb4d0000fcb8b019b029755d96f74c4dfc131fc WatchSource:0}: Error finding container a5bc349b92de7b08c0260e680fb4d0000fcb8b019b029755d96f74c4dfc131fc: Status 404 returned error can't find the container with id a5bc349b92de7b08c0260e680fb4d0000fcb8b019b029755d96f74c4dfc131fc Apr 16 18:02:51.121559 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:51.121533 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60136db5_eb71_48af_b059_62d18f47a211.slice/crio-bb3fb33a8683d29a91a97ffe326f25502f01c167cb42d8b9784f7c962639ff3c WatchSource:0}: Error finding container bb3fb33a8683d29a91a97ffe326f25502f01c167cb42d8b9784f7c962639ff3c: Status 404 returned error can't find the container with id bb3fb33a8683d29a91a97ffe326f25502f01c167cb42d8b9784f7c962639ff3c Apr 16 18:02:51.123332 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:51.123313 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47f30843_f6cf_4b8f_97fc_e1c1b5c83d63.slice/crio-43f56d4fd484ee216ab2fbdfda9594683a85ff39315beb2e102b873dbf095844 WatchSource:0}: Error finding container 43f56d4fd484ee216ab2fbdfda9594683a85ff39315beb2e102b873dbf095844: Status 404 returned error can't find the 
container with id 43f56d4fd484ee216ab2fbdfda9594683a85ff39315beb2e102b873dbf095844 Apr 16 18:02:51.125577 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:02:51.125552 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ee28cfd_b76c_488a_8374_405ee3a9a635.slice/crio-f039fe9552c5cd0c5908452a33434dac9bf2f16e9cdfb07dc5c6b6cc73ff0eda WatchSource:0}: Error finding container f039fe9552c5cd0c5908452a33434dac9bf2f16e9cdfb07dc5c6b6cc73ff0eda: Status 404 returned error can't find the container with id f039fe9552c5cd0c5908452a33434dac9bf2f16e9cdfb07dc5c6b6cc73ff0eda Apr 16 18:02:51.509957 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.509848 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:49 +0000 UTC" deadline="2027-10-07 11:54:43.948100421 +0000 UTC" Apr 16 18:02:51.509957 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.509887 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12929h51m52.438217953s" Apr 16 18:02:51.572719 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.572666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal" event={"ID":"f3d3ab7bc151663feafe988605589632","Type":"ContainerStarted","Data":"a8da253046e3d3449503705eb5313c1752a8e10c42e208081b74efb72903f935"} Apr 16 18:02:51.578216 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.578139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7lp9m" event={"ID":"8c8eed63-c467-428b-aa99-b72e120b58e9","Type":"ContainerStarted","Data":"901b718127a100f6d5488e4b71d012ad7494205bd8c8f3c9b740257c151394b2"} Apr 16 18:02:51.584963 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.584918 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" event={"ID":"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2","Type":"ContainerStarted","Data":"75818bcfeb8125d486e5243a8280805cf50fc80bc6a6259d3e703196bc8371ed"} Apr 16 18:02:51.590506 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.590483 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" event={"ID":"8c94f1e2-89d6-435e-884f-a0a41da4b42f","Type":"ContainerStarted","Data":"13be84c35cea318c5aefdec5078cab348ea5586a9af22d9827eb00889393872e"} Apr 16 18:02:51.596352 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.596331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zlrpl" event={"ID":"47f30843-f6cf-4b8f-97fc-e1c1b5c83d63","Type":"ContainerStarted","Data":"43f56d4fd484ee216ab2fbdfda9594683a85ff39315beb2e102b873dbf095844"} Apr 16 18:02:51.601630 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.601607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dmn8s" event={"ID":"60136db5-eb71-48af-b059-62d18f47a211","Type":"ContainerStarted","Data":"bb3fb33a8683d29a91a97ffe326f25502f01c167cb42d8b9784f7c962639ff3c"} Apr 16 18:02:51.603775 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.603715 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerStarted","Data":"70f64d779029ca0c07ca4fd0ac569b47e86274c15c3c1fea68a7bbd416e42110"} Apr 16 18:02:51.606657 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.606590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" event={"ID":"3ee28cfd-b76c-488a-8374-405ee3a9a635","Type":"ContainerStarted","Data":"f039fe9552c5cd0c5908452a33434dac9bf2f16e9cdfb07dc5c6b6cc73ff0eda"} Apr 16 18:02:51.608251 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.608202 
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2bz8t" event={"ID":"a0b63c55-85f3-4126-9cbf-dac101325a0b","Type":"ContainerStarted","Data":"a5bc349b92de7b08c0260e680fb4d0000fcb8b019b029755d96f74c4dfc131fc"} Apr 16 18:02:51.699405 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:51.699378 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:52.094425 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:52.094389 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvb2\" (UniqueName: \"kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2\") pod \"network-check-target-6tmgb\" (UID: \"a336e08f-92e1-4f5f-99d6-9f8231b01727\") " pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:52.094568 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:52.094441 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:52.094624 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:52.094604 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:52.094672 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:52.094663 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs podName:2f073ea3-db3b-4eaa-9a74-db58c9d97b21 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:54.094645062 +0000 UTC m=+6.098813050 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs") pod "network-metrics-daemon-ndzmp" (UID: "2f073ea3-db3b-4eaa-9a74-db58c9d97b21") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:52.095099 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:52.095080 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:52.095211 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:52.095104 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:52.095211 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:52.095117 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ssvb2 for pod openshift-network-diagnostics/network-check-target-6tmgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:52.095211 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:52.095177 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2 podName:a336e08f-92e1-4f5f-99d6-9f8231b01727 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:54.095148808 +0000 UTC m=+6.099316799 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ssvb2" (UniqueName: "kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2") pod "network-check-target-6tmgb" (UID: "a336e08f-92e1-4f5f-99d6-9f8231b01727") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:52.558326 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:52.558295 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:52.558794 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:52.558424 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:02:52.558794 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:52.558502 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:52.558794 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:52.558624 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:02:52.618215 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:52.617653 2578 generic.go:358] "Generic (PLEG): container finished" podID="4e388a8259ec158bfe054771dea9bf3e" containerID="e2efc0743fa97babaae79c357d2e2fcea0a208931c335dcf5e3a41836f4f6981" exitCode=0 Apr 16 18:02:52.618215 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:52.618170 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" event={"ID":"4e388a8259ec158bfe054771dea9bf3e","Type":"ContainerDied","Data":"e2efc0743fa97babaae79c357d2e2fcea0a208931c335dcf5e3a41836f4f6981"} Apr 16 18:02:52.633573 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:52.633523 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-167.ec2.internal" podStartSLOduration=2.633506947 podStartE2EDuration="2.633506947s" podCreationTimestamp="2026-04-16 18:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:51.588301423 +0000 UTC m=+3.592469433" watchObservedRunningTime="2026-04-16 18:02:52.633506947 +0000 UTC m=+4.637674943" Apr 16 18:02:53.626758 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:53.626324 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" event={"ID":"4e388a8259ec158bfe054771dea9bf3e","Type":"ContainerStarted","Data":"65250ea66dbcc05eed7f43aa9c54876ae3407dc1681fd77e9cb5e4107da3086d"} Apr 16 18:02:54.110540 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.110456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvb2\" (UniqueName: 
\"kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2\") pod \"network-check-target-6tmgb\" (UID: \"a336e08f-92e1-4f5f-99d6-9f8231b01727\") " pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:54.110540 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.110512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:54.110760 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.110657 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:54.110760 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.110715 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs podName:2f073ea3-db3b-4eaa-9a74-db58c9d97b21 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:58.110697539 +0000 UTC m=+10.114865528 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs") pod "network-metrics-daemon-ndzmp" (UID: "2f073ea3-db3b-4eaa-9a74-db58c9d97b21") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:54.111129 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.111109 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:54.111239 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.111138 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:54.111239 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.111164 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ssvb2 for pod openshift-network-diagnostics/network-check-target-6tmgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:54.111239 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.111208 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2 podName:a336e08f-92e1-4f5f-99d6-9f8231b01727 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:58.11119328 +0000 UTC m=+10.115361270 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ssvb2" (UniqueName: "kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2") pod "network-check-target-6tmgb" (UID: "a336e08f-92e1-4f5f-99d6-9f8231b01727") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:54.492912 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.491950 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-167.ec2.internal" podStartSLOduration=4.491928523 podStartE2EDuration="4.491928523s" podCreationTimestamp="2026-04-16 18:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:53.643511761 +0000 UTC m=+5.647679773" watchObservedRunningTime="2026-04-16 18:02:54.491928523 +0000 UTC m=+6.496096532" Apr 16 18:02:54.492912 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.492196 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-28sv4"] Apr 16 18:02:54.494576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.494171 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:54.494576 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.494243 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:02:54.558486 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.558457 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:54.558604 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.558580 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:02:54.558776 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.558463 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:54.558776 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.558750 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:02:54.614476 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.614356 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:54.614476 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.614425 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6fd513cc-2c53-4020-94b3-faf51a11b03f-kubelet-config\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:54.614476 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.614450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6fd513cc-2c53-4020-94b3-faf51a11b03f-dbus\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:54.715249 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.715206 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:54.715683 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.715287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/6fd513cc-2c53-4020-94b3-faf51a11b03f-kubelet-config\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:54.715683 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.715315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6fd513cc-2c53-4020-94b3-faf51a11b03f-dbus\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:54.715683 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.715476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6fd513cc-2c53-4020-94b3-faf51a11b03f-dbus\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:54.715683 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.715583 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:54.715683 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:54.715639 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret podName:6fd513cc-2c53-4020-94b3-faf51a11b03f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:55.215617592 +0000 UTC m=+7.219785582 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret") pod "global-pull-secret-syncer-28sv4" (UID: "6fd513cc-2c53-4020-94b3-faf51a11b03f") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:54.715929 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:54.715889 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6fd513cc-2c53-4020-94b3-faf51a11b03f-kubelet-config\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:55.219365 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:55.219332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:55.219544 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:55.219519 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:55.219603 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:55.219574 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret podName:6fd513cc-2c53-4020-94b3-faf51a11b03f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:56.21955796 +0000 UTC m=+8.223725968 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret") pod "global-pull-secret-syncer-28sv4" (UID: "6fd513cc-2c53-4020-94b3-faf51a11b03f") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:56.227456 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:56.227416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:56.227885 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:56.227586 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:56.227885 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:56.227660 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret podName:6fd513cc-2c53-4020-94b3-faf51a11b03f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:58.227640608 +0000 UTC m=+10.231808606 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret") pod "global-pull-secret-syncer-28sv4" (UID: "6fd513cc-2c53-4020-94b3-faf51a11b03f") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:56.557946 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:56.557720 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:56.557946 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:56.557860 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:02:56.558087 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:56.557940 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:56.558087 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:56.558050 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:02:56.558187 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:56.558100 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:56.558245 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:56.558203 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:02:58.141182 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:58.141129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvb2\" (UniqueName: \"kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2\") pod \"network-check-target-6tmgb\" (UID: \"a336e08f-92e1-4f5f-99d6-9f8231b01727\") " pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:58.141622 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:58.141197 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:58.141622 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.141294 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:58.141622 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.141315 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:58.141622 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.141328 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ssvb2 for pod openshift-network-diagnostics/network-check-target-6tmgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:58.141622 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.141366 2578 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:58.141622 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.141382 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2 podName:a336e08f-92e1-4f5f-99d6-9f8231b01727 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:06.141364631 +0000 UTC m=+18.145532621 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ssvb2" (UniqueName: "kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2") pod "network-check-target-6tmgb" (UID: "a336e08f-92e1-4f5f-99d6-9f8231b01727") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:58.141622 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.141426 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs podName:2f073ea3-db3b-4eaa-9a74-db58c9d97b21 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:06.141409354 +0000 UTC m=+18.145577345 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs") pod "network-metrics-daemon-ndzmp" (UID: "2f073ea3-db3b-4eaa-9a74-db58c9d97b21") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:58.242009 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:58.241974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:58.242186 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.242068 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:58.242186 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.242139 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret podName:6fd513cc-2c53-4020-94b3-faf51a11b03f nodeName:}" failed. No retries permitted until 2026-04-16 18:03:02.242125647 +0000 UTC m=+14.246293631 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret") pod "global-pull-secret-syncer-28sv4" (UID: "6fd513cc-2c53-4020-94b3-faf51a11b03f") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:58.559073 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:58.558542 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:02:58.559073 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.558667 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:02:58.559073 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:58.558711 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:02:58.559073 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:02:58.558722 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:02:58.559073 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.558808 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:02:58.559073 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:02:58.558885 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:00.557464 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:00.557430 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:00.557861 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:00.557430 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:00.557861 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:00.557569 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:00.557861 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:00.557426 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:00.557861 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:00.557655 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:00.557861 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:00.557704 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:02.272019 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:02.271988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:02.272413 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:02.272100 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:03:02.272413 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:02.272172 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret podName:6fd513cc-2c53-4020-94b3-faf51a11b03f nodeName:}" failed. No retries permitted until 2026-04-16 18:03:10.272145291 +0000 UTC m=+22.276313277 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret") pod "global-pull-secret-syncer-28sv4" (UID: "6fd513cc-2c53-4020-94b3-faf51a11b03f") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:03:02.557955 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:02.557872 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:02.557955 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:02.557894 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:02.558184 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:02.557888 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:02.558184 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:02.557993 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:02.558184 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:02.558095 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:02.558336 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:02.558208 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:04.557548 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:04.557516 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:04.557980 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:04.557556 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:04.557980 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:04.557516 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:04.557980 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:04.557642 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:04.557980 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:04.557739 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:04.557980 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:04.557803 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:06.204646 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:06.204609 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvb2\" (UniqueName: \"kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2\") pod \"network-check-target-6tmgb\" (UID: \"a336e08f-92e1-4f5f-99d6-9f8231b01727\") " pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:06.205202 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:06.204654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:06.205202 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:06.204766 2578 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:06.205202 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:06.204792 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:03:06.205202 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:06.204816 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:03:06.205202 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:06.204828 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ssvb2 for pod openshift-network-diagnostics/network-check-target-6tmgb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:06.205202 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:06.204832 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs podName:2f073ea3-db3b-4eaa-9a74-db58c9d97b21 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:22.20481468 +0000 UTC m=+34.208982666 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs") pod "network-metrics-daemon-ndzmp" (UID: "2f073ea3-db3b-4eaa-9a74-db58c9d97b21") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:06.205202 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:06.204877 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2 podName:a336e08f-92e1-4f5f-99d6-9f8231b01727 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:22.20486161 +0000 UTC m=+34.209029597 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ssvb2" (UniqueName: "kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2") pod "network-check-target-6tmgb" (UID: "a336e08f-92e1-4f5f-99d6-9f8231b01727") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:06.557462 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:06.557328 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:06.557462 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:06.557358 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:06.557462 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:06.557452 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:06.557743 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:06.557499 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:06.557743 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:06.557614 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:06.557743 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:06.557688 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:08.558326 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.558027 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:08.558911 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.558108 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:08.558911 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:08.558402 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:08.558911 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.558130 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:08.558911 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:08.558466 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:08.558911 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:08.558570 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:08.653005 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.652973 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7lp9m" event={"ID":"8c8eed63-c467-428b-aa99-b72e120b58e9","Type":"ContainerStarted","Data":"0c85ccc2a17a0c7e0cd73b8059b8eedca906d1df2d315cd39d2d723404c0a048"} Apr 16 18:03:08.654604 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.654576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" event={"ID":"f3e6d8de-b84b-46b8-afca-2bb1d7a16da2","Type":"ContainerStarted","Data":"3955d61674f00edc12569caa03f9719fc66ea13d530c9a2287729cff15385c7d"} Apr 16 18:03:08.655953 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.655922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" event={"ID":"8c94f1e2-89d6-435e-884f-a0a41da4b42f","Type":"ContainerStarted","Data":"5c3f6f1543e94b4e720392a66cab88990009eb695f39999b5fd892ad1fbcd92d"} Apr 16 18:03:08.657341 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.657312 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zlrpl" event={"ID":"47f30843-f6cf-4b8f-97fc-e1c1b5c83d63","Type":"ContainerStarted","Data":"1c2c6c6135d264621f18aa01aebdbb63b04b0da00aa7576f54f92b91a17d0b63"} Apr 16 18:03:08.659078 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.659055 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dmn8s" event={"ID":"60136db5-eb71-48af-b059-62d18f47a211","Type":"ContainerStarted","Data":"dcf99aab557f216f9445570c99724ba87cf4b0b5b02d8e51983e5981336429e0"} Apr 16 18:03:08.661709 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.661688 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:03:08.661996 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.661973 2578 generic.go:358] "Generic (PLEG): container finished" podID="b3ae8fd0-5875-4adc-8994-9c74852c6397" containerID="f3cd45913950cbdaa457d4f4444e115c01ee8d7a5af81c345d063d24cfb87c79" exitCode=1 Apr 16 18:03:08.662176 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.661995 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerStarted","Data":"5e33b10bae2cd5ce6475f57d0430e79cea6ba5d1f95161cb09cd0baf8f4d676c"} Apr 16 18:03:08.662176 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.662018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerStarted","Data":"1332b02ef5765892c15c3df279aaf2d966797704c3e2b97f6a5ce784e5f4a7f1"} Apr 16 18:03:08.662176 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.662031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerStarted","Data":"0d72d70678d978790cb5e0fa07c5a1006c97b97d3045882d17a3b8b4c962f368"} Apr 16 18:03:08.662176 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.662045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerStarted","Data":"94ad0682070190f0a05e577247e533bee8effb2f6603110d34d1626a7d8c8e24"} Apr 16 18:03:08.662176 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.662056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" 
event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerDied","Data":"f3cd45913950cbdaa457d4f4444e115c01ee8d7a5af81c345d063d24cfb87c79"} Apr 16 18:03:08.662176 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.662071 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerStarted","Data":"a15ffa4c0ddcb5d8b7133e253420e58ccd84a46d9b0923d0d09522f146348399"} Apr 16 18:03:08.663315 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.663294 2578 generic.go:358] "Generic (PLEG): container finished" podID="3ee28cfd-b76c-488a-8374-405ee3a9a635" containerID="e821f9cd7c4d773fe21e37ff8f9fdcab93ddafd1b5154f86d997df0fd78bfa2d" exitCode=0 Apr 16 18:03:08.663419 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.663328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" event={"ID":"3ee28cfd-b76c-488a-8374-405ee3a9a635","Type":"ContainerDied","Data":"e821f9cd7c4d773fe21e37ff8f9fdcab93ddafd1b5154f86d997df0fd78bfa2d"} Apr 16 18:03:08.675351 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.675314 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7lp9m" podStartSLOduration=3.866381737 podStartE2EDuration="20.675303007s" podCreationTimestamp="2026-04-16 18:02:48 +0000 UTC" firstStartedPulling="2026-04-16 18:02:51.132921811 +0000 UTC m=+3.137089799" lastFinishedPulling="2026-04-16 18:03:07.941843069 +0000 UTC m=+19.946011069" observedRunningTime="2026-04-16 18:03:08.674710871 +0000 UTC m=+20.678878880" watchObservedRunningTime="2026-04-16 18:03:08.675303007 +0000 UTC m=+20.679471015" Apr 16 18:03:08.687226 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.687184 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zlrpl" podStartSLOduration=8.653670539 
podStartE2EDuration="20.687169534s" podCreationTimestamp="2026-04-16 18:02:48 +0000 UTC" firstStartedPulling="2026-04-16 18:02:51.124938946 +0000 UTC m=+3.129106932" lastFinishedPulling="2026-04-16 18:03:03.158437938 +0000 UTC m=+15.162605927" observedRunningTime="2026-04-16 18:03:08.686707771 +0000 UTC m=+20.690875808" watchObservedRunningTime="2026-04-16 18:03:08.687169534 +0000 UTC m=+20.691337536" Apr 16 18:03:08.715691 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.715656 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dmn8s" podStartSLOduration=3.929173031 podStartE2EDuration="20.715646635s" podCreationTimestamp="2026-04-16 18:02:48 +0000 UTC" firstStartedPulling="2026-04-16 18:02:51.124332674 +0000 UTC m=+3.128500663" lastFinishedPulling="2026-04-16 18:03:07.910806264 +0000 UTC m=+19.914974267" observedRunningTime="2026-04-16 18:03:08.699374281 +0000 UTC m=+20.703542288" watchObservedRunningTime="2026-04-16 18:03:08.715646635 +0000 UTC m=+20.719814642" Apr 16 18:03:08.734041 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:08.734001 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kt6vt" podStartSLOduration=2.925149012 podStartE2EDuration="19.733987855s" podCreationTimestamp="2026-04-16 18:02:49 +0000 UTC" firstStartedPulling="2026-04-16 18:02:51.13267809 +0000 UTC m=+3.136846080" lastFinishedPulling="2026-04-16 18:03:07.941516923 +0000 UTC m=+19.945684923" observedRunningTime="2026-04-16 18:03:08.733617665 +0000 UTC m=+20.737785672" watchObservedRunningTime="2026-04-16 18:03:08.733987855 +0000 UTC m=+20.738155863" Apr 16 18:03:09.232542 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:09.232518 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:03:09.541476 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:03:09.541371 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:03:09.232538121Z","UUID":"a6fb5d60-81e5-4c4c-8216-59110a3faf81","Handler":null,"Name":"","Endpoint":""} Apr 16 18:03:09.544588 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:09.544565 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:03:09.544714 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:09.544595 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:03:09.666478 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:09.666427 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2bz8t" event={"ID":"a0b63c55-85f3-4126-9cbf-dac101325a0b","Type":"ContainerStarted","Data":"8eb96584d894645fe50282241ec921a037a7042aadce4acb3d22928bdb1f1ee9"} Apr 16 18:03:09.668361 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:09.668332 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" event={"ID":"8c94f1e2-89d6-435e-884f-a0a41da4b42f","Type":"ContainerStarted","Data":"ec4f608954a9e5e936cfc8dceda5cd54d864cb6bacdfbfae53323d481bf6a5e1"} Apr 16 18:03:09.679826 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:09.679778 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2bz8t" podStartSLOduration=4.890886348 podStartE2EDuration="21.679765796s" podCreationTimestamp="2026-04-16 18:02:48 +0000 UTC" firstStartedPulling="2026-04-16 18:02:51.12200368 +0000 UTC m=+3.126171669" lastFinishedPulling="2026-04-16 18:03:07.910883129 +0000 UTC m=+19.915051117" 
observedRunningTime="2026-04-16 18:03:09.679660852 +0000 UTC m=+21.683828858" watchObservedRunningTime="2026-04-16 18:03:09.679765796 +0000 UTC m=+21.683933803" Apr 16 18:03:10.337276 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:10.337069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:10.337395 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:10.337221 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:03:10.337395 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:10.337342 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret podName:6fd513cc-2c53-4020-94b3-faf51a11b03f nodeName:}" failed. No retries permitted until 2026-04-16 18:03:26.337322483 +0000 UTC m=+38.341490470 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret") pod "global-pull-secret-syncer-28sv4" (UID: "6fd513cc-2c53-4020-94b3-faf51a11b03f") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:03:10.557416 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:10.557240 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:10.557416 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:10.557243 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:10.557416 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:10.557353 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:10.557646 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:10.557488 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:10.557646 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:10.557530 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:10.557646 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:10.557583 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:10.672257 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:10.672213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" event={"ID":"8c94f1e2-89d6-435e-884f-a0a41da4b42f","Type":"ContainerStarted","Data":"2d1c1c5cd3f74dc44aec48900109681681135a58d065c3f614af79263c066b95"} Apr 16 18:03:10.675010 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:10.674981 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:03:10.675492 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:10.675468 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerStarted","Data":"13f3cd685ccfc4257426ed9def64489aadc8e008bbe25b5629d9a06f598f2326"} Apr 16 18:03:10.712454 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:10.712412 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kpvvv" podStartSLOduration=3.63803901 podStartE2EDuration="22.712398923s" podCreationTimestamp="2026-04-16 18:02:48 +0000 UTC" firstStartedPulling="2026-04-16 18:02:51.130059897 +0000 UTC m=+3.134227889" lastFinishedPulling="2026-04-16 18:03:10.204419813 +0000 UTC m=+22.208587802" observedRunningTime="2026-04-16 18:03:10.71221392 +0000 UTC m=+22.716381930" watchObservedRunningTime="2026-04-16 18:03:10.712398923 +0000 UTC m=+22.716566929" Apr 16 18:03:12.179341 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:12.179307 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zlrpl" Apr 16 18:03:12.180006 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:12.179990 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zlrpl" Apr 16 18:03:12.557295 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:12.557226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:12.557295 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:12.557249 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:12.557479 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:12.557226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:12.557479 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:12.557346 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:12.557479 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:12.557406 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:12.557571 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:12.557525 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:13.514175 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.513926 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zlrpl" Apr 16 18:03:13.514705 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.514504 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zlrpl" Apr 16 18:03:13.682972 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.682946 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:03:13.683281 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.683259 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerStarted","Data":"9bfb740b738ff9794fc165c7daf690aee4bbe0d9861c902558b83f3c3269cb52"} Apr 16 18:03:13.683544 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.683512 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:03:13.683623 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.683555 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" 
Apr 16 18:03:13.683740 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.683724 2578 scope.go:117] "RemoveContainer" containerID="f3cd45913950cbdaa457d4f4444e115c01ee8d7a5af81c345d063d24cfb87c79" Apr 16 18:03:13.685129 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.685106 2578 generic.go:358] "Generic (PLEG): container finished" podID="3ee28cfd-b76c-488a-8374-405ee3a9a635" containerID="436bcbe6f105c3eb06cd9b73345039d1fc12e0a8bfe42d08f1f99c6132829424" exitCode=0 Apr 16 18:03:13.685241 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.685185 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" event={"ID":"3ee28cfd-b76c-488a-8374-405ee3a9a635","Type":"ContainerDied","Data":"436bcbe6f105c3eb06cd9b73345039d1fc12e0a8bfe42d08f1f99c6132829424"} Apr 16 18:03:13.699010 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:13.698991 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:03:14.557866 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.557839 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:14.557866 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.557861 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:14.558401 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.557878 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:14.558401 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:14.557966 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:14.558401 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:14.558072 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:14.558401 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:14.558149 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:14.689696 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.689676 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:03:14.690011 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.689985 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" event={"ID":"b3ae8fd0-5875-4adc-8994-9c74852c6397","Type":"ContainerStarted","Data":"960f793dea4125cb32c58af6e8eb00fb435aa0cf728bd5f7098c8f863b769059"} Apr 16 18:03:14.690561 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.690535 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:03:14.706874 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.706853 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:03:14.709321 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.709301 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6tmgb"] Apr 16 18:03:14.709426 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.709368 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:14.709473 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:14.709457 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:14.713511 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.713489 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-28sv4"] Apr 16 18:03:14.713597 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.713568 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:14.713673 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:14.713657 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:14.718709 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.718685 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ndzmp"] Apr 16 18:03:14.718804 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.718766 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:14.718898 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:14.718873 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:14.728567 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:14.728530 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" podStartSLOduration=9.871465477 podStartE2EDuration="26.728519057s" podCreationTimestamp="2026-04-16 18:02:48 +0000 UTC" firstStartedPulling="2026-04-16 18:02:51.131765673 +0000 UTC m=+3.135933658" lastFinishedPulling="2026-04-16 18:03:07.98881924 +0000 UTC m=+19.992987238" observedRunningTime="2026-04-16 18:03:14.728206274 +0000 UTC m=+26.732374283" watchObservedRunningTime="2026-04-16 18:03:14.728519057 +0000 UTC m=+26.732687089" Apr 16 18:03:15.693986 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:15.693958 2578 generic.go:358] "Generic (PLEG): container finished" podID="3ee28cfd-b76c-488a-8374-405ee3a9a635" containerID="3977c6ea16d3cd2b81ef7832f5958265f63198cd4a6c802933b5d3f041527c16" exitCode=0 Apr 16 18:03:15.694429 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:15.694028 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" event={"ID":"3ee28cfd-b76c-488a-8374-405ee3a9a635","Type":"ContainerDied","Data":"3977c6ea16d3cd2b81ef7832f5958265f63198cd4a6c802933b5d3f041527c16"} Apr 16 18:03:16.557845 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:16.557797 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:16.558011 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:16.557797 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:16.558011 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:16.557921 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:16.558011 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:16.557984 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:16.558201 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:16.558132 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:16.558264 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:16.558237 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:17.699263 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:17.699047 2578 generic.go:358] "Generic (PLEG): container finished" podID="3ee28cfd-b76c-488a-8374-405ee3a9a635" containerID="357b3ff8c0357979584b43f77df61fb5df94ee709384a7991fc18614be20af02" exitCode=0 Apr 16 18:03:17.699263 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:17.699128 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" event={"ID":"3ee28cfd-b76c-488a-8374-405ee3a9a635","Type":"ContainerDied","Data":"357b3ff8c0357979584b43f77df61fb5df94ee709384a7991fc18614be20af02"} Apr 16 18:03:18.558059 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:18.558033 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:18.558269 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:18.558141 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:18.558269 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:18.558227 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:18.558394 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:18.558334 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:18.558394 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:18.558362 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:18.558497 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:18.558414 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:20.558053 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.558021 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:20.558732 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.558083 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:20.558732 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.558045 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:20.558732 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:20.558242 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndzmp" podUID="2f073ea3-db3b-4eaa-9a74-db58c9d97b21" Apr 16 18:03:20.558732 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:20.558319 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6tmgb" podUID="a336e08f-92e1-4f5f-99d6-9f8231b01727" Apr 16 18:03:20.558732 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:20.558412 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-28sv4" podUID="6fd513cc-2c53-4020-94b3-faf51a11b03f" Apr 16 18:03:20.868950 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.868924 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-167.ec2.internal" event="NodeReady" Apr 16 18:03:20.869138 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.869031 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:03:20.906578 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.906549 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm"] Apr 16 18:03:20.932276 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.931411 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff"] Apr 16 18:03:20.949451 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.949417 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-97bcf5c66-dbdrj"] Apr 16 18:03:20.949626 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.949532 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:20.949693 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.949630 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff" Apr 16 18:03:20.952890 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.952867 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 18:03:20.953049 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.952902 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:03:20.953771 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.953733 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mhxp2\"" Apr 16 18:03:20.954210 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.953827 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-s4v9v\"" Apr 16 18:03:20.954210 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.953742 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 18:03:20.954210 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.954003 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:03:20.954210 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.954069 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 18:03:20.954520 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.954487 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:03:20.964925 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.964903 2578 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-cmsdt"] Apr 16 18:03:20.988295 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.988271 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-77bd68bc8b-btvvd"] Apr 16 18:03:20.988439 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.988392 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:20.992266 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.992245 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 18:03:20.992363 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.992349 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 18:03:20.992672 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.992655 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 18:03:20.992755 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.992672 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 18:03:20.992755 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.992696 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 18:03:20.992755 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.992661 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sc59r\"" Apr 16 18:03:20.992923 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:20.992885 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 
18:03:21.005006 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.004988 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj"] Apr 16 18:03:21.005130 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.005106 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.005430 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.005415 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.007793 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.007774 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:03:21.009716 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.009696 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:03:21.010312 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.009827 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pv6n6\"" Apr 16 18:03:21.010312 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.009917 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 18:03:21.010312 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.010061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:03:21.010312 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.010146 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dnqxl\"" Apr 16 18:03:21.010312 ip-10-0-142-167 
kubenswrapper[2578]: I0416 18:03:21.010202 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:03:21.010589 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.010419 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:03:21.010589 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.010432 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 18:03:21.016817 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.016792 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:21.016910 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.016844 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fadab673-8208-44dd-8dbd-4cfd3f66947b-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:21.016910 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.016884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq756\" (UniqueName: \"kubernetes.io/projected/fadab673-8208-44dd-8dbd-4cfd3f66947b-kube-api-access-zq756\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: 
\"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:21.028868 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.028845 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:03:21.029087 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.029071 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-4t7fq"] Apr 16 18:03:21.033406 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.033384 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 18:03:21.049821 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.049802 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm"] Apr 16 18:03:21.049898 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.049827 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff"] Apr 16 18:03:21.049898 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.049838 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-k687t"] Apr 16 18:03:21.049898 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.049859 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:21.050002 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.049926 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" Apr 16 18:03:21.052488 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.052468 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 18:03:21.052578 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.052509 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 18:03:21.052893 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.052873 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:03:21.053032 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.053017 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 18:03:21.053100 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.053084 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 18:03:21.053229 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.053211 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-7cxr5\"" Apr 16 18:03:21.053229 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.053226 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 18:03:21.053347 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.053231 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-6xbl7\"" Apr 16 18:03:21.053622 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.053607 2578 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:03:21.060130 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.060112 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 18:03:21.068070 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.068053 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"] Apr 16 18:03:21.068171 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.068069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-k687t" Apr 16 18:03:21.070863 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.070848 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mxcz9\"" Apr 16 18:03:21.071110 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.071098 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:03:21.071494 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.071479 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:03:21.088800 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.088772 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj"] Apr 16 18:03:21.088800 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.088795 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"] Apr 16 18:03:21.088949 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.088911 
2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6" Apr 16 18:03:21.091791 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.091773 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-b6drq\"" Apr 16 18:03:21.091875 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.091862 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 18:03:21.091929 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.091899 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:03:21.092004 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.091990 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 18:03:21.092046 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.092019 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 18:03:21.109350 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.109332 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"] Apr 16 18:03:21.109472 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.109458 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg" Apr 16 18:03:21.112087 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.112069 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 18:03:21.112244 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.112231 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 18:03:21.112318 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.112257 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 18:03:21.112365 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.112335 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-sc6r5\"" Apr 16 18:03:21.112451 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.112440 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:03:21.117610 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b3a3bd-16e5-4438-837b-7f24def37fc3-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.117699 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117618 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b3a3bd-16e5-4438-837b-7f24def37fc3-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.117699 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117638 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:21.117699 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117655 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-444nz\" (UniqueName: \"kubernetes.io/projected/e5b3a3bd-16e5-4438-837b-7f24def37fc3-kube-api-access-444nz\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.117699 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-default-certificate\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:21.117863 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-stats-auth\") pod \"router-default-97bcf5c66-dbdrj\" (UID: 
\"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:21.117863 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq756\" (UniqueName: \"kubernetes.io/projected/fadab673-8208-44dd-8dbd-4cfd3f66947b-kube-api-access-zq756\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:21.117863 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gwtv\" (UniqueName: \"kubernetes.io/projected/6509e47c-1e65-4e88-b44a-91bf5ba93351-kube-api-access-6gwtv\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:21.117863 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/635e7d14-be03-47cc-ba03-fc37558d4103-ca-trust-extracted\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.118045 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-installation-pull-secrets\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 
18:03:21.118045 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-trusted-ca\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.118045 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw6zf\" (UniqueName: \"kubernetes.io/projected/9dd185dc-97ca-4edf-a263-a8115564eb69-kube-api-access-hw6zf\") pod \"volume-data-source-validator-7d955d5dd4-xc7ff\" (UID: \"9dd185dc-97ca-4edf-a263-a8115564eb69\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff" Apr 16 18:03:21.118045 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.117965 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:21.118045 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8lfs\" (UniqueName: \"kubernetes.io/projected/76309363-7d66-4131-9509-98c2fcc90649-kube-api-access-f8lfs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:21.118045 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-image-registry-private-configuration\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.118342 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118072 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-registry-certificates\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.118342 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118099 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-bound-sa-token\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.118342 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t6ds\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-kube-api-access-9t6ds\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.118342 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs\") pod 
\"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:21.118342 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118193 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e5b3a3bd-16e5-4438-837b-7f24def37fc3-snapshots\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.118342 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.118342 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118274 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5b3a3bd-16e5-4438-837b-7f24def37fc3-tmp\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.118342 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:21.118342 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:03:21.118338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5b3a3bd-16e5-4438-837b-7f24def37fc3-serving-cert\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.118747 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.118369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fadab673-8208-44dd-8dbd-4cfd3f66947b-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:21.118747 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.118383 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:03:21.118747 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.118424 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls podName:fadab673-8208-44dd-8dbd-4cfd3f66947b nodeName:}" failed. No retries permitted until 2026-04-16 18:03:21.618411712 +0000 UTC m=+33.622579696 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-464rm" (UID: "fadab673-8208-44dd-8dbd-4cfd3f66947b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:03:21.119082 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.119038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fadab673-8208-44dd-8dbd-4cfd3f66947b-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:21.126542 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.126525 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5"] Apr 16 18:03:21.126679 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.126663 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" Apr 16 18:03:21.129210 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.129191 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 18:03:21.129293 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.129205 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:03:21.129492 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.129473 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 18:03:21.129581 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.129500 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 18:03:21.129581 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.129506 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:03:21.129581 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.129514 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 18:03:21.129581 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.129537 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:03:21.132669 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.132642 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq756\" (UniqueName: 
\"kubernetes.io/projected/fadab673-8208-44dd-8dbd-4cfd3f66947b-kube-api-access-zq756\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:21.142513 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.142492 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj"] Apr 16 18:03:21.142650 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.142634 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5" Apr 16 18:03:21.146965 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.146947 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 18:03:21.147305 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.147290 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-42p57\"" Apr 16 18:03:21.153199 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.153178 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hddck"] Apr 16 18:03:21.153303 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.153270 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:21.155747 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.155728 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 18:03:21.167471 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167451 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-97bcf5c66-dbdrj"] Apr 16 18:03:21.167471 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167469 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77bd68bc8b-btvvd"] Apr 16 18:03:21.167576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167479 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-4t7fq"] Apr 16 18:03:21.167576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167486 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"] Apr 16 18:03:21.167576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167497 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-k687t"] Apr 16 18:03:21.167576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167504 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-cmsdt"] Apr 16 18:03:21.167576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167511 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hddck"] Apr 16 18:03:21.167576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167518 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"] Apr 16 18:03:21.167576 
ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167525 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"] Apr 16 18:03:21.167576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167532 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj"] Apr 16 18:03:21.167576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167538 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5"] Apr 16 18:03:21.167576 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167548 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:21.167999 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.167552 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-c6tf9"] Apr 16 18:03:21.170262 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.170243 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:03:21.170368 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.170300 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:03:21.170524 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.170503 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkw8n\"" Apr 16 18:03:21.170524 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.170519 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:03:21.184775 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:03:21.184757 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c6tf9"] Apr 16 18:03:21.184905 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.184887 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:21.187457 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.187440 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:03:21.187538 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.187475 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:03:21.187538 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.187516 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzqks\"" Apr 16 18:03:21.187631 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.187595 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:03:21.187673 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.187648 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:03:21.219003 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.218985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldfgl\" (UniqueName: \"kubernetes.io/projected/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-kube-api-access-ldfgl\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" Apr 16 18:03:21.219118 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zkszl\" (UniqueName: \"kubernetes.io/projected/1581c4ae-cdee-4c3c-8bb7-c74cf465de66-kube-api-access-zkszl\") pod \"network-check-source-7b678d77c7-k687t\" (UID: \"1581c4ae-cdee-4c3c-8bb7-c74cf465de66\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-k687t" Apr 16 18:03:21.219118 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219037 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-trusted-ca\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" Apr 16 18:03:21.219118 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t6ds\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-kube-api-access-9t6ds\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.219301 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:21.219301 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d8511e45-12bd-403d-adca-66af780a5704-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" Apr 16 18:03:21.219301 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz8s7\" (UniqueName: \"kubernetes.io/projected/d8511e45-12bd-403d-adca-66af780a5704-kube-api-access-jz8s7\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" Apr 16 18:03:21.219301 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.219238 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:03:21.219457 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.219311 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:21.719291237 +0000 UTC m=+33.723459234 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : secret "router-metrics-certs-default" not found Apr 16 18:03:21.219457 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f8ed6c21-e3ad-417e-8b73-a21ef70ba241-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68678d9669-2vff5\" (UID: \"f8ed6c21-e3ad-417e-8b73-a21ef70ba241\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5" Apr 16 18:03:21.219457 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219401 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s4xg\" (UniqueName: \"kubernetes.io/projected/4a741170-26d5-4d55-bd37-3e869323218c-kube-api-access-5s4xg\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nqmqg\" (UID: \"4a741170-26d5-4d55-bd37-3e869323218c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg" Apr 16 18:03:21.219457 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" Apr 16 18:03:21.219653 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/e5b3a3bd-16e5-4438-837b-7f24def37fc3-snapshots\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.219653 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.219653 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-serving-cert\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" Apr 16 18:03:21.219653 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.219623 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:03:21.219653 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.219638 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bd68bc8b-btvvd: secret "image-registry-tls" not found Apr 16 18:03:21.219859 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219676 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5b3a3bd-16e5-4438-837b-7f24def37fc3-tmp\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 
18:03:21.219859 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.219690 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls podName:635e7d14-be03-47cc-ba03-fc37558d4103 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:21.719674622 +0000 UTC m=+33.723842611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls") pod "image-registry-77bd68bc8b-btvvd" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103") : secret "image-registry-tls" not found Apr 16 18:03:21.219859 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5b3a3bd-16e5-4438-837b-7f24def37fc3-serving-cert\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.219859 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/635e7d14-be03-47cc-ba03-fc37558d4103-ca-trust-extracted\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.219859 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-installation-pull-secrets\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.219859 
ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b3a3bd-16e5-4438-837b-7f24def37fc3-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.219859 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.219840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b3a3bd-16e5-4438-837b-7f24def37fc3-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" Apr 16 18:03:21.220222 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220151 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/635e7d14-be03-47cc-ba03-fc37558d4103-ca-trust-extracted\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.220278 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f837e66e-5440-4461-ae9b-0bff515a395f-config\") pod \"service-ca-operator-69965bb79d-fdmn6\" (UID: \"f837e66e-5440-4461-ae9b-0bff515a395f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6" Apr 16 18:03:21.220338 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:21.220338 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gwtv\" (UniqueName: \"kubernetes.io/projected/6509e47c-1e65-4e88-b44a-91bf5ba93351-kube-api-access-6gwtv\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:21.220438 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb29g\" (UniqueName: \"kubernetes.io/projected/f8ed6c21-e3ad-417e-8b73-a21ef70ba241-kube-api-access-jb29g\") pod \"managed-serviceaccount-addon-agent-68678d9669-2vff5\" (UID: \"f8ed6c21-e3ad-417e-8b73-a21ef70ba241\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5" Apr 16 18:03:21.220438 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsntj\" (UniqueName: \"kubernetes.io/projected/f837e66e-5440-4461-ae9b-0bff515a395f-kube-api-access-qsntj\") pod \"service-ca-operator-69965bb79d-fdmn6\" (UID: \"f837e66e-5440-4461-ae9b-0bff515a395f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6" Apr 16 18:03:21.220438 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220427 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-hub\") 
pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.220583 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220452 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a741170-26d5-4d55-bd37-3e869323218c-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nqmqg\" (UID: \"4a741170-26d5-4d55-bd37-3e869323218c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"
Apr 16 18:03:21.220583 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-image-registry-private-configuration\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.220583 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f837e66e-5440-4461-ae9b-0bff515a395f-serving-cert\") pod \"service-ca-operator-69965bb79d-fdmn6\" (UID: \"f837e66e-5440-4461-ae9b-0bff515a395f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"
Apr 16 18:03:21.220583 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b3a3bd-16e5-4438-837b-7f24def37fc3-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt"
Apr 16 18:03:21.220583 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220531 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-config\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.220828 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-registry-certificates\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.220828 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220618 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-bound-sa-token\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.220828 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.220650 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:03:21.220828 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-ca\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.220828 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.220719 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls podName:6509e47c-1e65-4e88-b44a-91bf5ba93351 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:21.720680933 +0000 UTC m=+33.724848923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls") pod "cluster-samples-operator-667775844f-xsspj" (UID: "6509e47c-1e65-4e88-b44a-91bf5ba93351") : secret "samples-operator-tls" not found
Apr 16 18:03:21.220828 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220743 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.220828 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-trusted-ca\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.220828 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b3a3bd-16e5-4438-837b-7f24def37fc3-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt"
Apr 16 18:03:21.221218 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8lfs\" (UniqueName: \"kubernetes.io/projected/76309363-7d66-4131-9509-98c2fcc90649-kube-api-access-f8lfs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj"
Apr 16 18:03:21.221218 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a741170-26d5-4d55-bd37-3e869323218c-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nqmqg\" (UID: \"4a741170-26d5-4d55-bd37-3e869323218c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"
Apr 16 18:03:21.221218 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-444nz\" (UniqueName: \"kubernetes.io/projected/e5b3a3bd-16e5-4438-837b-7f24def37fc3-kube-api-access-444nz\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt"
Apr 16 18:03:21.221218 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-default-certificate\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj"
Apr 16 18:03:21.221218 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-stats-auth\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj"
Apr 16 18:03:21.221218 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.220988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw6zf\" (UniqueName: \"kubernetes.io/projected/9dd185dc-97ca-4edf-a263-a8115564eb69-kube-api-access-hw6zf\") pod \"volume-data-source-validator-7d955d5dd4-xc7ff\" (UID: \"9dd185dc-97ca-4edf-a263-a8115564eb69\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff"
Apr 16 18:03:21.221218 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.221012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj"
Apr 16 18:03:21.221218 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.221172 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:21.721140898 +0000 UTC m=+33.725308908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : configmap references non-existent config key: service-ca.crt
Apr 16 18:03:21.221218 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.221175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-registry-certificates\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.222117 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.222043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-trusted-ca\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.222117 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.222071 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e5b3a3bd-16e5-4438-837b-7f24def37fc3-snapshots\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt"
Apr 16 18:03:21.222614 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.222592 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5b3a3bd-16e5-4438-837b-7f24def37fc3-serving-cert\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt"
Apr 16 18:03:21.222723 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.222588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5b3a3bd-16e5-4438-837b-7f24def37fc3-tmp\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt"
Apr 16 18:03:21.222993 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.222952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-installation-pull-secrets\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.224262 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.224222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-default-certificate\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj"
Apr 16 18:03:21.224873 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.224850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-image-registry-private-configuration\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.226369 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.226348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-stats-auth\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj"
Apr 16 18:03:21.230766 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.230748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t6ds\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-kube-api-access-9t6ds\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.231798 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.231760 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw6zf\" (UniqueName: \"kubernetes.io/projected/9dd185dc-97ca-4edf-a263-a8115564eb69-kube-api-access-hw6zf\") pod \"volume-data-source-validator-7d955d5dd4-xc7ff\" (UID: \"9dd185dc-97ca-4edf-a263-a8115564eb69\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff"
Apr 16 18:03:21.233056 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.233031 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8lfs\" (UniqueName: \"kubernetes.io/projected/76309363-7d66-4131-9509-98c2fcc90649-kube-api-access-f8lfs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj"
Apr 16 18:03:21.233414 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.233391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-bound-sa-token\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:21.233676 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.233656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gwtv\" (UniqueName: \"kubernetes.io/projected/6509e47c-1e65-4e88-b44a-91bf5ba93351-kube-api-access-6gwtv\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj"
Apr 16 18:03:21.233862 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.233844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-444nz\" (UniqueName: \"kubernetes.io/projected/e5b3a3bd-16e5-4438-837b-7f24def37fc3-kube-api-access-444nz\") pod \"insights-operator-5785d4fcdd-cmsdt\" (UID: \"e5b3a3bd-16e5-4438-837b-7f24def37fc3\") " pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt"
Apr 16 18:03:21.273485 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.273458 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff"
Apr 16 18:03:21.318767 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.318741 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt"
Apr 16 18:03:21.321674 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.321650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-serving-cert\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.321775 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.321701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-config-volume\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9"
Apr 16 18:03:21.321775 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.321732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzkbp\" (UniqueName: \"kubernetes.io/projected/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-kube-api-access-tzkbp\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9"
Apr 16 18:03:21.321853 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.321773 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f837e66e-5440-4461-ae9b-0bff515a395f-config\") pod \"service-ca-operator-69965bb79d-fdmn6\" (UID: \"f837e66e-5440-4461-ae9b-0bff515a395f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"
Apr 16 18:03:21.321884 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.321842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8e03b2b2-a432-4859-bef2-74d7f9646342-klusterlet-config\") pod \"klusterlet-addon-workmgr-cf6d9648b-6vlhj\" (UID: \"8e03b2b2-a432-4859-bef2-74d7f9646342\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj"
Apr 16 18:03:21.321917 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.321891 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jb29g\" (UniqueName: \"kubernetes.io/projected/f8ed6c21-e3ad-417e-8b73-a21ef70ba241-kube-api-access-jb29g\") pod \"managed-serviceaccount-addon-agent-68678d9669-2vff5\" (UID: \"f8ed6c21-e3ad-417e-8b73-a21ef70ba241\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5"
Apr 16 18:03:21.322024 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322002 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxgrg\" (UniqueName: \"kubernetes.io/projected/85ac3502-8838-441d-985c-4dd2dc6e803c-kube-api-access-dxgrg\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck"
Apr 16 18:03:21.322135 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsntj\" (UniqueName: \"kubernetes.io/projected/f837e66e-5440-4461-ae9b-0bff515a395f-kube-api-access-qsntj\") pod \"service-ca-operator-69965bb79d-fdmn6\" (UID: \"f837e66e-5440-4461-ae9b-0bff515a395f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"
Apr 16 18:03:21.322221 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-hub\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.322331 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322296 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a741170-26d5-4d55-bd37-3e869323218c-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nqmqg\" (UID: \"4a741170-26d5-4d55-bd37-3e869323218c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"
Apr 16 18:03:21.322441 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f837e66e-5440-4461-ae9b-0bff515a395f-serving-cert\") pod \"service-ca-operator-69965bb79d-fdmn6\" (UID: \"f837e66e-5440-4461-ae9b-0bff515a395f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"
Apr 16 18:03:21.322441 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322367 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-config\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.322441 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322411 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9"
Apr 16 18:03:21.322600 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-ca\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.322600 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322474 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.322600 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e03b2b2-a432-4859-bef2-74d7f9646342-tmp\") pod \"klusterlet-addon-workmgr-cf6d9648b-6vlhj\" (UID: \"8e03b2b2-a432-4859-bef2-74d7f9646342\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj"
Apr 16 18:03:21.322600 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322535 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-tmp-dir\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9"
Apr 16 18:03:21.322600 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322583 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a741170-26d5-4d55-bd37-3e869323218c-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nqmqg\" (UID: \"4a741170-26d5-4d55-bd37-3e869323218c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"
Apr 16 18:03:21.322833 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8rft\" (UniqueName: \"kubernetes.io/projected/8e03b2b2-a432-4859-bef2-74d7f9646342-kube-api-access-k8rft\") pod \"klusterlet-addon-workmgr-cf6d9648b-6vlhj\" (UID: \"8e03b2b2-a432-4859-bef2-74d7f9646342\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj"
Apr 16 18:03:21.322833 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck"
Apr 16 18:03:21.322833 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldfgl\" (UniqueName: \"kubernetes.io/projected/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-kube-api-access-ldfgl\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.322833 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkszl\" (UniqueName: \"kubernetes.io/projected/1581c4ae-cdee-4c3c-8bb7-c74cf465de66-kube-api-access-zkszl\") pod \"network-check-source-7b678d77c7-k687t\" (UID: \"1581c4ae-cdee-4c3c-8bb7-c74cf465de66\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-k687t"
Apr 16 18:03:21.322833 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-trusted-ca\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.322833 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d8511e45-12bd-403d-adca-66af780a5704-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.323184 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jz8s7\" (UniqueName: \"kubernetes.io/projected/d8511e45-12bd-403d-adca-66af780a5704-kube-api-access-jz8s7\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.323184 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f8ed6c21-e3ad-417e-8b73-a21ef70ba241-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68678d9669-2vff5\" (UID: \"f8ed6c21-e3ad-417e-8b73-a21ef70ba241\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5"
Apr 16 18:03:21.323184 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s4xg\" (UniqueName: \"kubernetes.io/projected/4a741170-26d5-4d55-bd37-3e869323218c-kube-api-access-5s4xg\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nqmqg\" (UID: \"4a741170-26d5-4d55-bd37-3e869323218c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"
Apr 16 18:03:21.323184 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.322921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.323184 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.323069 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-config\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.323965 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.323933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-trusted-ca\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.325010 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.324410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-serving-cert\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.325010 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.324451 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a741170-26d5-4d55-bd37-3e869323218c-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nqmqg\" (UID: \"4a741170-26d5-4d55-bd37-3e869323218c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"
Apr 16 18:03:21.325010 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.324968 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d8511e45-12bd-403d-adca-66af780a5704-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.325227 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.325071 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a741170-26d5-4d55-bd37-3e869323218c-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nqmqg\" (UID: \"4a741170-26d5-4d55-bd37-3e869323218c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"
Apr 16 18:03:21.326129 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.326101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f8ed6c21-e3ad-417e-8b73-a21ef70ba241-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68678d9669-2vff5\" (UID: \"f8ed6c21-e3ad-417e-8b73-a21ef70ba241\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5"
Apr 16 18:03:21.326806 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.326766 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-ca\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.326967 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.326941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f837e66e-5440-4461-ae9b-0bff515a395f-serving-cert\") pod \"service-ca-operator-69965bb79d-fdmn6\" (UID: \"f837e66e-5440-4461-ae9b-0bff515a395f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"
Apr 16 18:03:21.327184 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.327143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-hub\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.327754 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.327738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.327827 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.327803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d8511e45-12bd-403d-adca-66af780a5704-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.332374 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.332304 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f837e66e-5440-4461-ae9b-0bff515a395f-config\") pod \"service-ca-operator-69965bb79d-fdmn6\" (UID: \"f837e66e-5440-4461-ae9b-0bff515a395f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"
Apr 16 18:03:21.333968 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.333921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb29g\" (UniqueName: \"kubernetes.io/projected/f8ed6c21-e3ad-417e-8b73-a21ef70ba241-kube-api-access-jb29g\") pod \"managed-serviceaccount-addon-agent-68678d9669-2vff5\" (UID: \"f8ed6c21-e3ad-417e-8b73-a21ef70ba241\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5"
Apr 16 18:03:21.334389 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.334369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsntj\" (UniqueName: \"kubernetes.io/projected/f837e66e-5440-4461-ae9b-0bff515a395f-kube-api-access-qsntj\") pod \"service-ca-operator-69965bb79d-fdmn6\" (UID: \"f837e66e-5440-4461-ae9b-0bff515a395f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"
Apr 16 18:03:21.334492 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.334470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz8s7\" (UniqueName: \"kubernetes.io/projected/d8511e45-12bd-403d-adca-66af780a5704-kube-api-access-jz8s7\") pod \"cluster-proxy-proxy-agent-c998fdf48-zdgvs\" (UID: \"d8511e45-12bd-403d-adca-66af780a5704\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"
Apr 16 18:03:21.334558 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.334518 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldfgl\" (UniqueName: \"kubernetes.io/projected/bea2a2a8-4d21-4fd7-978c-0b7af8200cd7-kube-api-access-ldfgl\") pod \"console-operator-d87b8d5fc-4t7fq\" (UID: \"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.334784 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.334763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s4xg\" (UniqueName: \"kubernetes.io/projected/4a741170-26d5-4d55-bd37-3e869323218c-kube-api-access-5s4xg\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nqmqg\" (UID: \"4a741170-26d5-4d55-bd37-3e869323218c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"
Apr 16 18:03:21.335032 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.335014 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkszl\" (UniqueName: \"kubernetes.io/projected/1581c4ae-cdee-4c3c-8bb7-c74cf465de66-kube-api-access-zkszl\") pod \"network-check-source-7b678d77c7-k687t\" (UID: \"1581c4ae-cdee-4c3c-8bb7-c74cf465de66\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-k687t"
Apr 16 18:03:21.365468 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.365440 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq"
Apr 16 18:03:21.371297 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.371247 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qnnb2"]
Apr 16 18:03:21.376464 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.376440 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-k687t"
Apr 16 18:03:21.396621 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.396600 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"
Apr 16 18:03:21.400054 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.400036 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qnnb2"
Apr 16 18:03:21.402688 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.402667 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6g7d8\""
Apr 16 18:03:21.416786 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.416767 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg" Apr 16 18:03:21.423639 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.423615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e03b2b2-a432-4859-bef2-74d7f9646342-tmp\") pod \"klusterlet-addon-workmgr-cf6d9648b-6vlhj\" (UID: \"8e03b2b2-a432-4859-bef2-74d7f9646342\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:21.423733 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.423652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-tmp-dir\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:21.423733 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.423701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8rft\" (UniqueName: \"kubernetes.io/projected/8e03b2b2-a432-4859-bef2-74d7f9646342-kube-api-access-k8rft\") pod \"klusterlet-addon-workmgr-cf6d9648b-6vlhj\" (UID: \"8e03b2b2-a432-4859-bef2-74d7f9646342\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:21.423733 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.423730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:21.423887 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.423832 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 16 18:03:21.423887 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.423872 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert podName:85ac3502-8838-441d-985c-4dd2dc6e803c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:21.923858839 +0000 UTC m=+33.928026824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert") pod "ingress-canary-hddck" (UID: "85ac3502-8838-441d-985c-4dd2dc6e803c") : secret "canary-serving-cert" not found Apr 16 18:03:21.424023 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.424004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e03b2b2-a432-4859-bef2-74d7f9646342-tmp\") pod \"klusterlet-addon-workmgr-cf6d9648b-6vlhj\" (UID: \"8e03b2b2-a432-4859-bef2-74d7f9646342\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:21.424073 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.424039 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-tmp-dir\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:21.424073 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.424017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-config-volume\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:21.424171 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.424097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-tzkbp\" (UniqueName: \"kubernetes.io/projected/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-kube-api-access-tzkbp\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:21.424171 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.424149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8e03b2b2-a432-4859-bef2-74d7f9646342-klusterlet-config\") pod \"klusterlet-addon-workmgr-cf6d9648b-6vlhj\" (UID: \"8e03b2b2-a432-4859-bef2-74d7f9646342\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:21.424264 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.424198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxgrg\" (UniqueName: \"kubernetes.io/projected/85ac3502-8838-441d-985c-4dd2dc6e803c-kube-api-access-dxgrg\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:21.424301 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.424266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:21.424374 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.424362 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:21.424412 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.424398 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls podName:44d91f6d-be25-4f64-af47-6cdda8f2bfb6 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:21.924387867 +0000 UTC m=+33.928555851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls") pod "dns-default-c6tf9" (UID: "44d91f6d-be25-4f64-af47-6cdda8f2bfb6") : secret "dns-default-metrics-tls" not found Apr 16 18:03:21.424515 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.424494 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-config-volume\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:21.426918 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.426895 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8e03b2b2-a432-4859-bef2-74d7f9646342-klusterlet-config\") pod \"klusterlet-addon-workmgr-cf6d9648b-6vlhj\" (UID: \"8e03b2b2-a432-4859-bef2-74d7f9646342\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:21.433015 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.432994 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxgrg\" (UniqueName: \"kubernetes.io/projected/85ac3502-8838-441d-985c-4dd2dc6e803c-kube-api-access-dxgrg\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:21.433692 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.433645 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8rft\" (UniqueName: \"kubernetes.io/projected/8e03b2b2-a432-4859-bef2-74d7f9646342-kube-api-access-k8rft\") pod \"klusterlet-addon-workmgr-cf6d9648b-6vlhj\" (UID: \"8e03b2b2-a432-4859-bef2-74d7f9646342\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:21.434382 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.434360 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzkbp\" (UniqueName: \"kubernetes.io/projected/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-kube-api-access-tzkbp\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:21.452300 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.452280 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" Apr 16 18:03:21.460860 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.460842 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5" Apr 16 18:03:21.480558 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.480530 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:21.525855 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.525651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6497ce92-67e0-497d-b2db-ccc3571b7753-hosts-file\") pod \"node-resolver-qnnb2\" (UID: \"6497ce92-67e0-497d-b2db-ccc3571b7753\") " pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.525972 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.525957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtv9w\" (UniqueName: \"kubernetes.io/projected/6497ce92-67e0-497d-b2db-ccc3571b7753-kube-api-access-mtv9w\") pod \"node-resolver-qnnb2\" (UID: \"6497ce92-67e0-497d-b2db-ccc3571b7753\") " pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.526046 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.526031 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6497ce92-67e0-497d-b2db-ccc3571b7753-tmp-dir\") pod \"node-resolver-qnnb2\" (UID: \"6497ce92-67e0-497d-b2db-ccc3571b7753\") " pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.626537 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.626469 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6497ce92-67e0-497d-b2db-ccc3571b7753-tmp-dir\") pod \"node-resolver-qnnb2\" (UID: \"6497ce92-67e0-497d-b2db-ccc3571b7753\") " pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.627068 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.626561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6497ce92-67e0-497d-b2db-ccc3571b7753-hosts-file\") pod 
\"node-resolver-qnnb2\" (UID: \"6497ce92-67e0-497d-b2db-ccc3571b7753\") " pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.627068 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.626673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtv9w\" (UniqueName: \"kubernetes.io/projected/6497ce92-67e0-497d-b2db-ccc3571b7753-kube-api-access-mtv9w\") pod \"node-resolver-qnnb2\" (UID: \"6497ce92-67e0-497d-b2db-ccc3571b7753\") " pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.627068 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.626700 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6497ce92-67e0-497d-b2db-ccc3571b7753-hosts-file\") pod \"node-resolver-qnnb2\" (UID: \"6497ce92-67e0-497d-b2db-ccc3571b7753\") " pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.627068 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.626712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:21.627068 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.626829 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6497ce92-67e0-497d-b2db-ccc3571b7753-tmp-dir\") pod \"node-resolver-qnnb2\" (UID: \"6497ce92-67e0-497d-b2db-ccc3571b7753\") " pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.627068 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.626901 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 
18:03:21.627068 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.626982 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls podName:fadab673-8208-44dd-8dbd-4cfd3f66947b nodeName:}" failed. No retries permitted until 2026-04-16 18:03:22.626962265 +0000 UTC m=+34.631130257 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-464rm" (UID: "fadab673-8208-44dd-8dbd-4cfd3f66947b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:03:21.635269 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.635249 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtv9w\" (UniqueName: \"kubernetes.io/projected/6497ce92-67e0-497d-b2db-ccc3571b7753-kube-api-access-mtv9w\") pod \"node-resolver-qnnb2\" (UID: \"6497ce92-67e0-497d-b2db-ccc3571b7753\") " pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.710788 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.710760 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qnnb2" Apr 16 18:03:21.728094 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.728072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:21.728231 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.728114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:21.728231 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.728146 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:21.728231 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.728211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:21.728391 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.728246 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret 
"router-metrics-certs-default" not found Apr 16 18:03:21.728391 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.728266 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:22.72824762 +0000 UTC m=+34.732415610 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : configmap references non-existent config key: service-ca.crt Apr 16 18:03:21.728391 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.728287 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:22.728277464 +0000 UTC m=+34.732445450 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : secret "router-metrics-certs-default" not found Apr 16 18:03:21.728391 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.728324 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:03:21.728391 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.728385 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls podName:6509e47c-1e65-4e88-b44a-91bf5ba93351 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:22.728369171 +0000 UTC m=+34.732537157 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls") pod "cluster-samples-operator-667775844f-xsspj" (UID: "6509e47c-1e65-4e88-b44a-91bf5ba93351") : secret "samples-operator-tls" not found Apr 16 18:03:21.728391 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.728325 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:03:21.728639 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.728407 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bd68bc8b-btvvd: secret "image-registry-tls" not found Apr 16 18:03:21.728639 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.728446 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls podName:635e7d14-be03-47cc-ba03-fc37558d4103 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:22.728435749 +0000 UTC m=+34.732603740 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls") pod "image-registry-77bd68bc8b-btvvd" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103") : secret "image-registry-tls" not found Apr 16 18:03:21.930539 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.930444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:21.930691 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.930586 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:21.930691 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.930653 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert podName:85ac3502-8838-441d-985c-4dd2dc6e803c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:22.930635244 +0000 UTC m=+34.934803233 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert") pod "ingress-canary-hddck" (UID: "85ac3502-8838-441d-985c-4dd2dc6e803c") : secret "canary-serving-cert" not found Apr 16 18:03:21.930788 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:21.930744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:21.930840 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.930833 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:21.930884 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:21.930876 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls podName:44d91f6d-be25-4f64-af47-6cdda8f2bfb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:22.930864916 +0000 UTC m=+34.935032903 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls") pod "dns-default-c6tf9" (UID: "44d91f6d-be25-4f64-af47-6cdda8f2bfb6") : secret "dns-default-metrics-tls" not found Apr 16 18:03:22.234058 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.233980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvb2\" (UniqueName: \"kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2\") pod \"network-check-target-6tmgb\" (UID: \"a336e08f-92e1-4f5f-99d6-9f8231b01727\") " pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:22.234058 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.234024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:22.234302 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.234257 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:22.234360 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.234325 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs podName:2f073ea3-db3b-4eaa-9a74-db58c9d97b21 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:54.234310317 +0000 UTC m=+66.238478302 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs") pod "network-metrics-daemon-ndzmp" (UID: "2f073ea3-db3b-4eaa-9a74-db58c9d97b21") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:22.249547 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.249520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssvb2\" (UniqueName: \"kubernetes.io/projected/a336e08f-92e1-4f5f-99d6-9f8231b01727-kube-api-access-ssvb2\") pod \"network-check-target-6tmgb\" (UID: \"a336e08f-92e1-4f5f-99d6-9f8231b01727\") " pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:22.557904 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.557810 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4" Apr 16 18:03:22.557904 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.557841 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:22.557904 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.557856 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:22.560783 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.560756 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:03:22.560920 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.560812 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z2hb6\"" Apr 16 18:03:22.561138 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.561121 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:03:22.561211 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.561148 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qtxn\"" Apr 16 18:03:22.575469 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.575453 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:22.637272 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.637241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:22.637621 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.637395 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:03:22.637621 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.637447 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls podName:fadab673-8208-44dd-8dbd-4cfd3f66947b nodeName:}" failed. No retries permitted until 2026-04-16 18:03:24.637433408 +0000 UTC m=+36.641601398 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-464rm" (UID: "fadab673-8208-44dd-8dbd-4cfd3f66947b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:03:22.738171 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.738124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:22.738324 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.738189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:22.738324 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.738247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:22.738324 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.738291 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:24.738268618 +0000 UTC m=+36.742436606 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : configmap references non-existent config key: service-ca.crt
Apr 16 18:03:22.738512 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.738341 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:03:22.738512 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.738374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj"
Apr 16 18:03:22.738512 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.738391 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:24.738376705 +0000 UTC m=+36.742544708 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : secret "router-metrics-certs-default" not found
Apr 16 18:03:22.738512 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.738344 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:03:22.738512 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.738447 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bd68bc8b-btvvd: secret "image-registry-tls" not found
Apr 16 18:03:22.738512 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.738451 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:03:22.738512 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.738489 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls podName:635e7d14-be03-47cc-ba03-fc37558d4103 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:24.738478587 +0000 UTC m=+36.742646573 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls") pod "image-registry-77bd68bc8b-btvvd" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103") : secret "image-registry-tls" not found
Apr 16 18:03:22.738512 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.738508 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls podName:6509e47c-1e65-4e88-b44a-91bf5ba93351 nodeName:}" failed.
No retries permitted until 2026-04-16 18:03:24.738497589 +0000 UTC m=+36.742665587 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls") pod "cluster-samples-operator-667775844f-xsspj" (UID: "6509e47c-1e65-4e88-b44a-91bf5ba93351") : secret "samples-operator-tls" not found
Apr 16 18:03:22.939779 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.939745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck"
Apr 16 18:03:22.939943 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.939907 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:22.939993 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.939966 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert podName:85ac3502-8838-441d-985c-4dd2dc6e803c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:24.939951694 +0000 UTC m=+36.944119678 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert") pod "ingress-canary-hddck" (UID: "85ac3502-8838-441d-985c-4dd2dc6e803c") : secret "canary-serving-cert" not found
Apr 16 18:03:22.940047 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:22.940025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9"
Apr 16 18:03:22.940164 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.940135 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:03:22.940290 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:22.940196 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls podName:44d91f6d-be25-4f64-af47-6cdda8f2bfb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:24.940181706 +0000 UTC m=+36.944349694 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls") pod "dns-default-c6tf9" (UID: "44d91f6d-be25-4f64-af47-6cdda8f2bfb6") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:23.357150 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.357124 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6497ce92_67e0_497d_b2db_ccc3571b7753.slice/crio-b45e3fcd9c8cbc319328c1485911810958f559e341af8d148ab94a2c33912414 WatchSource:0}: Error finding container b45e3fcd9c8cbc319328c1485911810958f559e341af8d148ab94a2c33912414: Status 404 returned error can't find the container with id b45e3fcd9c8cbc319328c1485911810958f559e341af8d148ab94a2c33912414
Apr 16 18:03:23.655655 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.653626 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5"]
Apr 16 18:03:23.664026 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.663772 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-k687t"]
Apr 16 18:03:23.677922 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.677888 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs"]
Apr 16 18:03:23.679591 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.679565 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff"]
Apr 16 18:03:23.690526 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.690356 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6"]
Apr 16 18:03:23.695700 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.695639
2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-cmsdt"]
Apr 16 18:03:23.703543 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.703397 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj"]
Apr 16 18:03:23.706620 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.706601 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg"]
Apr 16 18:03:23.709687 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.709668 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-4t7fq"]
Apr 16 18:03:23.711424 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.711385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnnb2" event={"ID":"6497ce92-67e0-497d-b2db-ccc3571b7753","Type":"ContainerStarted","Data":"da4edb00d0f448fb7d735e7378cb7d2a1f3d10b10a8c75abc45301872765ed23"}
Apr 16 18:03:23.711510 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.711428 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnnb2" event={"ID":"6497ce92-67e0-497d-b2db-ccc3571b7753","Type":"ContainerStarted","Data":"b45e3fcd9c8cbc319328c1485911810958f559e341af8d148ab94a2c33912414"}
Apr 16 18:03:23.712452 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:23.712435 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6tmgb"]
Apr 16 18:03:23.722468 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.722438 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8ed6c21_e3ad_417e_8b73_a21ef70ba241.slice/crio-7a635c89d462d9db953f4516e6ff5e43525a1d288dd2ab49b017b4c6023a5258 WatchSource:0}:
Error finding container 7a635c89d462d9db953f4516e6ff5e43525a1d288dd2ab49b017b4c6023a5258: Status 404 returned error can't find the container with id 7a635c89d462d9db953f4516e6ff5e43525a1d288dd2ab49b017b4c6023a5258
Apr 16 18:03:23.722801 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.722767 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1581c4ae_cdee_4c3c_8bb7_c74cf465de66.slice/crio-2ce2ed62afd22f301dcca768d764b63cc32d01aff4ce9069ac27b94ed7611b00 WatchSource:0}: Error finding container 2ce2ed62afd22f301dcca768d764b63cc32d01aff4ce9069ac27b94ed7611b00: Status 404 returned error can't find the container with id 2ce2ed62afd22f301dcca768d764b63cc32d01aff4ce9069ac27b94ed7611b00
Apr 16 18:03:23.723446 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.723424 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8511e45_12bd_403d_adca_66af780a5704.slice/crio-221d14c9e147717536129e4f05b5a949a7005754c7e37316d91f04f0a05b09e3 WatchSource:0}: Error finding container 221d14c9e147717536129e4f05b5a949a7005754c7e37316d91f04f0a05b09e3: Status 404 returned error can't find the container with id 221d14c9e147717536129e4f05b5a949a7005754c7e37316d91f04f0a05b09e3
Apr 16 18:03:23.724392 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.724363 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dd185dc_97ca_4edf_a263_a8115564eb69.slice/crio-8f29afbb0f17f046a1ad15e24f3306c31e68ca4259daac3d08fd10bc0ac0dede WatchSource:0}: Error finding container 8f29afbb0f17f046a1ad15e24f3306c31e68ca4259daac3d08fd10bc0ac0dede: Status 404 returned error can't find the container with id 8f29afbb0f17f046a1ad15e24f3306c31e68ca4259daac3d08fd10bc0ac0dede
Apr 16 18:03:23.726367 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.725764 2578 manager.go:1169]
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf837e66e_5440_4461_ae9b_0bff515a395f.slice/crio-c54717284aa675b8738ca46a938e7dcae15e0723e7c06e2e223d19ce17b71225 WatchSource:0}: Error finding container c54717284aa675b8738ca46a938e7dcae15e0723e7c06e2e223d19ce17b71225: Status 404 returned error can't find the container with id c54717284aa675b8738ca46a938e7dcae15e0723e7c06e2e223d19ce17b71225
Apr 16 18:03:23.726542 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.726422 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5b3a3bd_16e5_4438_837b_7f24def37fc3.slice/crio-ccca4fdf8c8100f7ff5ceb1277513f2f2df9798adf2f55fadc7b9ce789791101 WatchSource:0}: Error finding container ccca4fdf8c8100f7ff5ceb1277513f2f2df9798adf2f55fadc7b9ce789791101: Status 404 returned error can't find the container with id ccca4fdf8c8100f7ff5ceb1277513f2f2df9798adf2f55fadc7b9ce789791101
Apr 16 18:03:23.739052 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.739032 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e03b2b2_a432_4859_bef2_74d7f9646342.slice/crio-9273ced9592f927516ac4758bf68f3c2e42fb95b401ec1b301e86c44de973deb WatchSource:0}: Error finding container 9273ced9592f927516ac4758bf68f3c2e42fb95b401ec1b301e86c44de973deb: Status 404 returned error can't find the container with id 9273ced9592f927516ac4758bf68f3c2e42fb95b401ec1b301e86c44de973deb
Apr 16 18:03:23.740009 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.739985 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a741170_26d5_4d55_bd37_3e869323218c.slice/crio-b85d9d2146153db7e062c16cb3d21613b9d8063b12917752aef5f4d631523b17 WatchSource:0}: Error finding container
b85d9d2146153db7e062c16cb3d21613b9d8063b12917752aef5f4d631523b17: Status 404 returned error can't find the container with id b85d9d2146153db7e062c16cb3d21613b9d8063b12917752aef5f4d631523b17
Apr 16 18:03:23.741273 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.741173 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea2a2a8_4d21_4fd7_978c_0b7af8200cd7.slice/crio-abacbb7ba3626f43a627039c151c98c85619b555edf05636e584bf0659c15479 WatchSource:0}: Error finding container abacbb7ba3626f43a627039c151c98c85619b555edf05636e584bf0659c15479: Status 404 returned error can't find the container with id abacbb7ba3626f43a627039c151c98c85619b555edf05636e584bf0659c15479
Apr 16 18:03:23.741952 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:23.741886 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda336e08f_92e1_4f5f_99d6_9f8231b01727.slice/crio-d8127aa12743fe90a43fb0b214fbfac222442d15a5472d3ec7135f3e5d2fc108 WatchSource:0}: Error finding container d8127aa12743fe90a43fb0b214fbfac222442d15a5472d3ec7135f3e5d2fc108: Status 404 returned error can't find the container with id d8127aa12743fe90a43fb0b214fbfac222442d15a5472d3ec7135f3e5d2fc108
Apr 16 18:03:24.655101 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.655062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm"
Apr 16 18:03:24.655308 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.655284 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret
"cluster-monitoring-operator-tls" not found
Apr 16 18:03:24.655382 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.655354 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls podName:fadab673-8208-44dd-8dbd-4cfd3f66947b nodeName:}" failed. No retries permitted until 2026-04-16 18:03:28.65533382 +0000 UTC m=+40.659501820 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-464rm" (UID: "fadab673-8208-44dd-8dbd-4cfd3f66947b") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:03:24.735737 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.735701 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6tmgb" event={"ID":"a336e08f-92e1-4f5f-99d6-9f8231b01727","Type":"ContainerStarted","Data":"d8127aa12743fe90a43fb0b214fbfac222442d15a5472d3ec7135f3e5d2fc108"}
Apr 16 18:03:24.750320 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.750250 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6" event={"ID":"f837e66e-5440-4461-ae9b-0bff515a395f","Type":"ContainerStarted","Data":"c54717284aa675b8738ca46a938e7dcae15e0723e7c06e2e223d19ce17b71225"}
Apr 16 18:03:24.755958 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.755929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj"
Apr 16 18:03:24.756086 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.755982
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:03:24.756086 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.756035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj"
Apr 16 18:03:24.756227 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.756182 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj"
Apr 16 18:03:24.756356 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.756325 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:28.756306228 +0000 UTC m=+40.760474217 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : configmap references non-existent config key: service-ca.crt
Apr 16 18:03:24.756731 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.756712 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:03:24.756823 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.756765 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:28.756750965 +0000 UTC m=+40.760918950 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : secret "router-metrics-certs-default" not found
Apr 16 18:03:24.756884 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.756833 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:03:24.756884 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.756845 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bd68bc8b-btvvd: secret "image-registry-tls" not found
Apr 16 18:03:24.756884 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.756878 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls podName:635e7d14-be03-47cc-ba03-fc37558d4103 nodeName:}" failed.
No retries permitted until 2026-04-16 18:03:28.756867031 +0000 UTC m=+40.761035019 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls") pod "image-registry-77bd68bc8b-btvvd" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103") : secret "image-registry-tls" not found
Apr 16 18:03:24.757027 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.756940 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:03:24.757027 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.756969 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls podName:6509e47c-1e65-4e88-b44a-91bf5ba93351 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:28.756959881 +0000 UTC m=+40.761127865 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls") pod "cluster-samples-operator-667775844f-xsspj" (UID: "6509e47c-1e65-4e88-b44a-91bf5ba93351") : secret "samples-operator-tls" not found
Apr 16 18:03:24.763137 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.763102 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-k687t" event={"ID":"1581c4ae-cdee-4c3c-8bb7-c74cf465de66","Type":"ContainerStarted","Data":"2ce2ed62afd22f301dcca768d764b63cc32d01aff4ce9069ac27b94ed7611b00"}
Apr 16 18:03:24.773084 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.773008 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5"
event={"ID":"f8ed6c21-e3ad-417e-8b73-a21ef70ba241","Type":"ContainerStarted","Data":"7a635c89d462d9db953f4516e6ff5e43525a1d288dd2ab49b017b4c6023a5258"}
Apr 16 18:03:24.790045 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.790018 2578 generic.go:358] "Generic (PLEG): container finished" podID="3ee28cfd-b76c-488a-8374-405ee3a9a635" containerID="2ff62f626f0b093f1f828950fcf7bf9a3ca1d97d5f7207232d2417d7d0ba4fe5" exitCode=0
Apr 16 18:03:24.790176 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.790088 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" event={"ID":"3ee28cfd-b76c-488a-8374-405ee3a9a635","Type":"ContainerDied","Data":"2ff62f626f0b093f1f828950fcf7bf9a3ca1d97d5f7207232d2417d7d0ba4fe5"}
Apr 16 18:03:24.809102 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.809048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg" event={"ID":"4a741170-26d5-4d55-bd37-3e869323218c","Type":"ContainerStarted","Data":"b85d9d2146153db7e062c16cb3d21613b9d8063b12917752aef5f4d631523b17"}
Apr 16 18:03:24.812771 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.812698 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" event={"ID":"8e03b2b2-a432-4859-bef2-74d7f9646342","Type":"ContainerStarted","Data":"9273ced9592f927516ac4758bf68f3c2e42fb95b401ec1b301e86c44de973deb"}
Apr 16 18:03:24.816104 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.816064 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" event={"ID":"d8511e45-12bd-403d-adca-66af780a5704","Type":"ContainerStarted","Data":"221d14c9e147717536129e4f05b5a949a7005754c7e37316d91f04f0a05b09e3"}
Apr 16 18:03:24.823902 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.823876
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" event={"ID":"e5b3a3bd-16e5-4438-837b-7f24def37fc3","Type":"ContainerStarted","Data":"ccca4fdf8c8100f7ff5ceb1277513f2f2df9798adf2f55fadc7b9ce789791101"}
Apr 16 18:03:24.836938 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.836796 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" event={"ID":"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7","Type":"ContainerStarted","Data":"abacbb7ba3626f43a627039c151c98c85619b555edf05636e584bf0659c15479"}
Apr 16 18:03:24.861094 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.861053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff" event={"ID":"9dd185dc-97ca-4edf-a263-a8115564eb69","Type":"ContainerStarted","Data":"8f29afbb0f17f046a1ad15e24f3306c31e68ca4259daac3d08fd10bc0ac0dede"}
Apr 16 18:03:24.964972 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.962114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck"
Apr 16 18:03:24.964972 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:24.962420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9"
Apr 16 18:03:24.964972 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.962636 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:03:24.964972
ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.962696 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls podName:44d91f6d-be25-4f64-af47-6cdda8f2bfb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:28.962676729 +0000 UTC m=+40.966844719 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls") pod "dns-default-c6tf9" (UID: "44d91f6d-be25-4f64-af47-6cdda8f2bfb6") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:24.964972 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.963754 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:24.964972 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:24.963799 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert podName:85ac3502-8838-441d-985c-4dd2dc6e803c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:28.963784529 +0000 UTC m=+40.967952515 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert") pod "ingress-canary-hddck" (UID: "85ac3502-8838-441d-985c-4dd2dc6e803c") : secret "canary-serving-cert" not found
Apr 16 18:03:25.901234 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:25.900428 2578 generic.go:358] "Generic (PLEG): container finished" podID="3ee28cfd-b76c-488a-8374-405ee3a9a635" containerID="e34abe7cef12610815816009dd3d3ee16ee7a5994e75dadf743de1c0b6956443" exitCode=0
Apr 16 18:03:25.901234 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:25.900495 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" event={"ID":"3ee28cfd-b76c-488a-8374-405ee3a9a635","Type":"ContainerDied","Data":"e34abe7cef12610815816009dd3d3ee16ee7a5994e75dadf743de1c0b6956443"}
Apr 16 18:03:25.953710 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:25.953148 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qnnb2" podStartSLOduration=4.953128296 podStartE2EDuration="4.953128296s" podCreationTimestamp="2026-04-16 18:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:24.882880468 +0000 UTC m=+36.887048488" watchObservedRunningTime="2026-04-16 18:03:25.953128296 +0000 UTC m=+37.957296288"
Apr 16 18:03:26.381176 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:26.378625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4"
Apr 16 18:03:26.393340 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:26.393310 2578 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fd513cc-2c53-4020-94b3-faf51a11b03f-original-pull-secret\") pod \"global-pull-secret-syncer-28sv4\" (UID: \"6fd513cc-2c53-4020-94b3-faf51a11b03f\") " pod="kube-system/global-pull-secret-syncer-28sv4"
Apr 16 18:03:26.469845 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:26.469803 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28sv4"
Apr 16 18:03:28.705548 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:28.705516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm"
Apr 16 18:03:28.705962 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.705643 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:03:28.705962 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.705727 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls podName:fadab673-8208-44dd-8dbd-4cfd3f66947b nodeName:}" failed. No retries permitted until 2026-04-16 18:03:36.705707326 +0000 UTC m=+48.709875325 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-464rm" (UID: "fadab673-8208-44dd-8dbd-4cfd3f66947b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:03:28.806333 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:28.806304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:28.806472 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:28.806349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:28.806472 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:28.806385 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:28.806472 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:28.806421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:28.806636 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.806505 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:36.806482225 +0000 UTC m=+48.810650213 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : configmap references non-existent config key: service-ca.crt Apr 16 18:03:28.806636 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.806572 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:03:28.806636 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.806576 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:03:28.806636 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.806593 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bd68bc8b-btvvd: secret "image-registry-tls" not found Apr 16 18:03:28.806636 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.806592 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:03:28.806866 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.806651 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:36.806635217 +0000 UTC m=+48.810803216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : secret "router-metrics-certs-default" not found Apr 16 18:03:28.806866 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.806670 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls podName:6509e47c-1e65-4e88-b44a-91bf5ba93351 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:36.806660787 +0000 UTC m=+48.810828778 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls") pod "cluster-samples-operator-667775844f-xsspj" (UID: "6509e47c-1e65-4e88-b44a-91bf5ba93351") : secret "samples-operator-tls" not found Apr 16 18:03:28.806866 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:28.806686 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls podName:635e7d14-be03-47cc-ba03-fc37558d4103 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:36.806678063 +0000 UTC m=+48.810846050 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls") pod "image-registry-77bd68bc8b-btvvd" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103") : secret "image-registry-tls" not found Apr 16 18:03:29.008958 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:29.008849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:29.008958 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:29.008930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:29.009222 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:29.009007 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:29.009222 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:29.009022 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:29.009222 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:29.009074 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls podName:44d91f6d-be25-4f64-af47-6cdda8f2bfb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:37.009052344 +0000 UTC m=+49.013220334 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls") pod "dns-default-c6tf9" (UID: "44d91f6d-be25-4f64-af47-6cdda8f2bfb6") : secret "dns-default-metrics-tls" not found Apr 16 18:03:29.009222 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:29.009093 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert podName:85ac3502-8838-441d-985c-4dd2dc6e803c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:37.00908385 +0000 UTC m=+49.013251841 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert") pod "ingress-canary-hddck" (UID: "85ac3502-8838-441d-985c-4dd2dc6e803c") : secret "canary-serving-cert" not found Apr 16 18:03:36.182666 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.182642 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-28sv4"] Apr 16 18:03:36.195198 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:36.195151 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd513cc_2c53_4020_94b3_faf51a11b03f.slice/crio-0d01a124e914a767ad709960572578f357feb85f8284c67e4793eb6b20a79609 WatchSource:0}: Error finding container 0d01a124e914a767ad709960572578f357feb85f8284c67e4793eb6b20a79609: Status 404 returned error can't find the container with id 0d01a124e914a767ad709960572578f357feb85f8284c67e4793eb6b20a79609 Apr 16 18:03:36.777508 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.777410 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: 
\"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:36.777679 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.777568 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:03:36.777679 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.777641 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls podName:fadab673-8208-44dd-8dbd-4cfd3f66947b nodeName:}" failed. No retries permitted until 2026-04-16 18:03:52.777620399 +0000 UTC m=+64.781788397 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-464rm" (UID: "fadab673-8208-44dd-8dbd-4cfd3f66947b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:03:36.878412 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.878382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:36.878585 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.878499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 
18:03:36.878585 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.878534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:36.878585 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.878564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:36.878741 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.878696 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:03:36.878741 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.878709 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bd68bc8b-btvvd: secret "image-registry-tls" not found Apr 16 18:03:36.878843 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.878762 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls podName:635e7d14-be03-47cc-ba03-fc37558d4103 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:52.878744316 +0000 UTC m=+64.882912305 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls") pod "image-registry-77bd68bc8b-btvvd" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103") : secret "image-registry-tls" not found Apr 16 18:03:36.878843 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.878822 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:03:36.878949 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.878851 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls podName:6509e47c-1e65-4e88-b44a-91bf5ba93351 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:52.878841843 +0000 UTC m=+64.883009831 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls") pod "cluster-samples-operator-667775844f-xsspj" (UID: "6509e47c-1e65-4e88-b44a-91bf5ba93351") : secret "samples-operator-tls" not found Apr 16 18:03:36.878949 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.878909 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:52.878900448 +0000 UTC m=+64.883068437 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : configmap references non-existent config key: service-ca.crt Apr 16 18:03:36.879060 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.878954 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:03:36.879060 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:36.878979 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs podName:76309363-7d66-4131-9509-98c2fcc90649 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:52.878971089 +0000 UTC m=+64.883139079 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs") pod "router-default-97bcf5c66-dbdrj" (UID: "76309363-7d66-4131-9509-98c2fcc90649") : secret "router-metrics-certs-default" not found Apr 16 18:03:36.930264 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.930145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6" event={"ID":"f837e66e-5440-4461-ae9b-0bff515a395f","Type":"ContainerStarted","Data":"d6d903872a5dd754eeedbba45f4c2314c63df98dbc9a9f6f969fa0d45bd116ba"} Apr 16 18:03:36.932191 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.931941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-k687t" event={"ID":"1581c4ae-cdee-4c3c-8bb7-c74cf465de66","Type":"ContainerStarted","Data":"c8b7024416a7ed9dc4bec67c6a029e6a796a719ba875a93c3e1fbf2d6e9f8a36"} Apr 16 18:03:36.934236 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:03:36.934214 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5" event={"ID":"f8ed6c21-e3ad-417e-8b73-a21ef70ba241","Type":"ContainerStarted","Data":"e5fd60b1684a1dc8ce1f1ff5ae7dfaa5722007819e821afd6f4a9013bbea6e65"} Apr 16 18:03:36.938514 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.938320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" event={"ID":"3ee28cfd-b76c-488a-8374-405ee3a9a635","Type":"ContainerStarted","Data":"8ac7d93ee695a09233fb458cbbcba4bf810bbc986c4c84002e24baffce885441"} Apr 16 18:03:36.940054 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.940034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg" event={"ID":"4a741170-26d5-4d55-bd37-3e869323218c","Type":"ContainerStarted","Data":"1c1d9454872a5cfa99a5c732469b542f5702c960eb9cba9d70f4fc1de211f3d7"} Apr 16 18:03:36.941900 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.941878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" event={"ID":"8e03b2b2-a432-4859-bef2-74d7f9646342","Type":"ContainerStarted","Data":"c4bdd25cf45ed2d9d1e9dfbf47ae07846f1a41f42802221e5c686302d3f34948"} Apr 16 18:03:36.943315 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.943296 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:36.944734 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.944715 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" Apr 16 18:03:36.944908 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:03:36.944887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" event={"ID":"d8511e45-12bd-403d-adca-66af780a5704","Type":"ContainerStarted","Data":"ce7015dc84325af3ca66f8352373482049ac38fb392f95bd0486e7f484747101"} Apr 16 18:03:36.946807 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.946400 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" event={"ID":"e5b3a3bd-16e5-4438-837b-7f24def37fc3","Type":"ContainerStarted","Data":"1ca5728ca0413ca6a4b0cec69064cac6465b1a4178170db346368cae6cfbd220"} Apr 16 18:03:36.948757 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.948738 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/0.log" Apr 16 18:03:36.948858 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.948773 2578 generic.go:358] "Generic (PLEG): container finished" podID="bea2a2a8-4d21-4fd7-978c-0b7af8200cd7" containerID="a66615225034a6da01a0c736748b402ae091c8ed8e7586a4788b354f07e8c642" exitCode=255 Apr 16 18:03:36.949045 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.949006 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" event={"ID":"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7","Type":"ContainerDied","Data":"a66615225034a6da01a0c736748b402ae091c8ed8e7586a4788b354f07e8c642"} Apr 16 18:03:36.949102 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.949092 2578 scope.go:117] "RemoveContainer" containerID="a66615225034a6da01a0c736748b402ae091c8ed8e7586a4788b354f07e8c642" Apr 16 18:03:36.950712 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.950333 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff" 
event={"ID":"9dd185dc-97ca-4edf-a263-a8115564eb69","Type":"ContainerStarted","Data":"8232f98a5165040b01dd0b4f696dde34e0b4dd1931cb8f37be1da328acb756ef"} Apr 16 18:03:36.952483 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.952460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-28sv4" event={"ID":"6fd513cc-2c53-4020-94b3-faf51a11b03f","Type":"ContainerStarted","Data":"0d01a124e914a767ad709960572578f357feb85f8284c67e4793eb6b20a79609"} Apr 16 18:03:36.954957 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.954934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6tmgb" event={"ID":"a336e08f-92e1-4f5f-99d6-9f8231b01727","Type":"ContainerStarted","Data":"e16c3194e167736a93957baeb8dce09dacecab61ea87c8b4fab0d9237adfca5a"} Apr 16 18:03:36.955229 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.955214 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6tmgb" Apr 16 18:03:36.960891 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.960849 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6" podStartSLOduration=23.586418512 podStartE2EDuration="35.960836442s" podCreationTimestamp="2026-04-16 18:03:01 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.727824677 +0000 UTC m=+35.731992676" lastFinishedPulling="2026-04-16 18:03:36.102242608 +0000 UTC m=+48.106410606" observedRunningTime="2026-04-16 18:03:36.95984706 +0000 UTC m=+48.964015070" watchObservedRunningTime="2026-04-16 18:03:36.960836442 +0000 UTC m=+48.965004451" Apr 16 18:03:36.992222 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:36.991500 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n6d8n" podStartSLOduration=16.325789409 
podStartE2EDuration="48.991486278s" podCreationTimestamp="2026-04-16 18:02:48 +0000 UTC" firstStartedPulling="2026-04-16 18:02:51.129570361 +0000 UTC m=+3.133738345" lastFinishedPulling="2026-04-16 18:03:23.795267226 +0000 UTC m=+35.799435214" observedRunningTime="2026-04-16 18:03:36.991099997 +0000 UTC m=+48.995268005" watchObservedRunningTime="2026-04-16 18:03:36.991486278 +0000 UTC m=+48.995654276" Apr 16 18:03:37.022348 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.021944 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cf6d9648b-6vlhj" podStartSLOduration=31.577369836 podStartE2EDuration="44.021927543s" podCreationTimestamp="2026-04-16 18:02:53 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.771752908 +0000 UTC m=+35.775920896" lastFinishedPulling="2026-04-16 18:03:36.216310615 +0000 UTC m=+48.220478603" observedRunningTime="2026-04-16 18:03:37.020797935 +0000 UTC m=+49.024965946" watchObservedRunningTime="2026-04-16 18:03:37.021927543 +0000 UTC m=+49.026095553" Apr 16 18:03:37.062836 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.062746 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xc7ff" podStartSLOduration=30.732755904 podStartE2EDuration="43.062731659s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.726679708 +0000 UTC m=+35.730847693" lastFinishedPulling="2026-04-16 18:03:36.056655448 +0000 UTC m=+48.060823448" observedRunningTime="2026-04-16 18:03:37.061733706 +0000 UTC m=+49.065901728" watchObservedRunningTime="2026-04-16 18:03:37.062731659 +0000 UTC m=+49.066899667" Apr 16 18:03:37.080399 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.080259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:37.080902 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.080658 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:37.080902 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:37.080384 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:37.080902 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:37.080844 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls podName:44d91f6d-be25-4f64-af47-6cdda8f2bfb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:53.080828403 +0000 UTC m=+65.084996387 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls") pod "dns-default-c6tf9" (UID: "44d91f6d-be25-4f64-af47-6cdda8f2bfb6") : secret "dns-default-metrics-tls" not found Apr 16 18:03:37.080902 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:37.080796 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:37.080902 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:37.080881 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert podName:85ac3502-8838-441d-985c-4dd2dc6e803c nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:53.080871879 +0000 UTC m=+65.085039866 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert") pod "ingress-canary-hddck" (UID: "85ac3502-8838-441d-985c-4dd2dc6e803c") : secret "canary-serving-cert" not found Apr 16 18:03:37.083194 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.082907 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68678d9669-2vff5" podStartSLOduration=31.589052677 podStartE2EDuration="44.082893835s" podCreationTimestamp="2026-04-16 18:02:53 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.724589258 +0000 UTC m=+35.728757245" lastFinishedPulling="2026-04-16 18:03:36.218430412 +0000 UTC m=+48.222598403" observedRunningTime="2026-04-16 18:03:37.080439415 +0000 UTC m=+49.084607425" watchObservedRunningTime="2026-04-16 18:03:37.082893835 +0000 UTC m=+49.087061844" Apr 16 18:03:37.143913 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.142444 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" podStartSLOduration=33.771292397 podStartE2EDuration="46.142426847s" podCreationTimestamp="2026-04-16 18:02:51 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.731107283 +0000 UTC m=+35.735275271" lastFinishedPulling="2026-04-16 18:03:36.102241721 +0000 UTC m=+48.106409721" observedRunningTime="2026-04-16 18:03:37.106284236 +0000 UTC m=+49.110452244" watchObservedRunningTime="2026-04-16 18:03:37.142426847 +0000 UTC m=+49.146594855" Apr 16 18:03:37.169766 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.168565 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6tmgb" podStartSLOduration=36.723794148 podStartE2EDuration="49.168548257s" 
podCreationTimestamp="2026-04-16 18:02:48 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.771769575 +0000 UTC m=+35.775937567" lastFinishedPulling="2026-04-16 18:03:36.216523687 +0000 UTC m=+48.220691676" observedRunningTime="2026-04-16 18:03:37.143762678 +0000 UTC m=+49.147930685" watchObservedRunningTime="2026-04-16 18:03:37.168548257 +0000 UTC m=+49.172716265" Apr 16 18:03:37.189599 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.188475 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-k687t" podStartSLOduration=23.686443938 podStartE2EDuration="36.188459968s" podCreationTimestamp="2026-04-16 18:03:01 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.725570534 +0000 UTC m=+35.729738523" lastFinishedPulling="2026-04-16 18:03:36.227586565 +0000 UTC m=+48.231754553" observedRunningTime="2026-04-16 18:03:37.187550876 +0000 UTC m=+49.191718883" watchObservedRunningTime="2026-04-16 18:03:37.188459968 +0000 UTC m=+49.192627976" Apr 16 18:03:37.190601 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.190363 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg" podStartSLOduration=25.905286846 podStartE2EDuration="38.190350818s" podCreationTimestamp="2026-04-16 18:02:59 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.771589993 +0000 UTC m=+35.775757981" lastFinishedPulling="2026-04-16 18:03:36.056653953 +0000 UTC m=+48.060821953" observedRunningTime="2026-04-16 18:03:37.169561458 +0000 UTC m=+49.173729466" watchObservedRunningTime="2026-04-16 18:03:37.190350818 +0000 UTC m=+49.194518827" Apr 16 18:03:37.969589 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.969500 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:03:37.970218 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.970187 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/0.log" Apr 16 18:03:37.970349 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.970234 2578 generic.go:358] "Generic (PLEG): container finished" podID="bea2a2a8-4d21-4fd7-978c-0b7af8200cd7" containerID="c5ec1984099229a1e624b1df49babdb83b34b49bbb90055871402161b2e41c4f" exitCode=255 Apr 16 18:03:37.971394 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.971366 2578 scope.go:117] "RemoveContainer" containerID="c5ec1984099229a1e624b1df49babdb83b34b49bbb90055871402161b2e41c4f" Apr 16 18:03:37.971569 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:37.971541 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4t7fq_openshift-console-operator(bea2a2a8-4d21-4fd7-978c-0b7af8200cd7)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" podUID="bea2a2a8-4d21-4fd7-978c-0b7af8200cd7" Apr 16 18:03:37.971812 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.971792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" event={"ID":"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7","Type":"ContainerDied","Data":"c5ec1984099229a1e624b1df49babdb83b34b49bbb90055871402161b2e41c4f"} Apr 16 18:03:37.971940 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:37.971839 2578 scope.go:117] "RemoveContainer" containerID="a66615225034a6da01a0c736748b402ae091c8ed8e7586a4788b354f07e8c642" Apr 16 18:03:38.974464 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:03:38.974435 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:03:38.974853 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:38.974781 2578 scope.go:117] "RemoveContainer" containerID="c5ec1984099229a1e624b1df49babdb83b34b49bbb90055871402161b2e41c4f" Apr 16 18:03:38.974972 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:38.974955 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4t7fq_openshift-console-operator(bea2a2a8-4d21-4fd7-978c-0b7af8200cd7)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" podUID="bea2a2a8-4d21-4fd7-978c-0b7af8200cd7" Apr 16 18:03:39.764541 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:39.764486 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qnnb2_6497ce92-67e0-497d-b2db-ccc3571b7753/dns-node-resolver/0.log" Apr 16 18:03:40.359437 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:40.359400 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dmn8s_60136db5-eb71-48af-b059-62d18f47a211/node-ca/0.log" Apr 16 18:03:41.366163 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:41.366132 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" Apr 16 18:03:41.366455 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:41.366185 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" Apr 16 18:03:41.366511 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:41.366498 2578 scope.go:117] "RemoveContainer" 
containerID="c5ec1984099229a1e624b1df49babdb83b34b49bbb90055871402161b2e41c4f" Apr 16 18:03:41.366666 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:03:41.366650 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4t7fq_openshift-console-operator(bea2a2a8-4d21-4fd7-978c-0b7af8200cd7)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" podUID="bea2a2a8-4d21-4fd7-978c-0b7af8200cd7" Apr 16 18:03:41.988767 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:41.988730 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-28sv4" event={"ID":"6fd513cc-2c53-4020-94b3-faf51a11b03f","Type":"ContainerStarted","Data":"b9f3387e054a84bb56a6090d909ddebf391f58a5129481cfeb416ea6c045433a"} Apr 16 18:03:41.990562 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:41.990534 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" event={"ID":"d8511e45-12bd-403d-adca-66af780a5704","Type":"ContainerStarted","Data":"590674fe0cdef4b48b99996aa73c87644b6286685b64b425c978730fc9cbce20"} Apr 16 18:03:41.990672 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:41.990565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" event={"ID":"d8511e45-12bd-403d-adca-66af780a5704","Type":"ContainerStarted","Data":"7d7856a57dc8444738e48ca54d1ee621ca8d72e8b1886ec53830a37fb56b22ac"} Apr 16 18:03:42.004871 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:42.004828 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-28sv4" podStartSLOduration=42.968766062 podStartE2EDuration="48.00481532s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" 
firstStartedPulling="2026-04-16 18:03:36.215562969 +0000 UTC m=+48.219730969" lastFinishedPulling="2026-04-16 18:03:41.251612237 +0000 UTC m=+53.255780227" observedRunningTime="2026-04-16 18:03:42.004002625 +0000 UTC m=+54.008170631" watchObservedRunningTime="2026-04-16 18:03:42.00481532 +0000 UTC m=+54.008983326" Apr 16 18:03:42.025764 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:42.025721 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" podStartSLOduration=31.857609338 podStartE2EDuration="49.025708254s" podCreationTimestamp="2026-04-16 18:02:53 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.725516326 +0000 UTC m=+35.729684312" lastFinishedPulling="2026-04-16 18:03:40.893615226 +0000 UTC m=+52.897783228" observedRunningTime="2026-04-16 18:03:42.02534509 +0000 UTC m=+54.029513093" watchObservedRunningTime="2026-04-16 18:03:42.025708254 +0000 UTC m=+54.029876262" Apr 16 18:03:46.705769 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:46.705737 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsvsh" Apr 16 18:03:52.557787 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.557756 2578 scope.go:117] "RemoveContainer" containerID="c5ec1984099229a1e624b1df49babdb83b34b49bbb90055871402161b2e41c4f" Apr 16 18:03:52.812722 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.812644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:52.814970 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.814940 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fadab673-8208-44dd-8dbd-4cfd3f66947b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-464rm\" (UID: \"fadab673-8208-44dd-8dbd-4cfd3f66947b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:52.913694 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.913667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:52.913810 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.913702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:52.913810 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.913722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:52.913810 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.913749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: 
\"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:52.914439 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.914410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76309363-7d66-4131-9509-98c2fcc90649-service-ca-bundle\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:52.915975 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.915942 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"image-registry-77bd68bc8b-btvvd\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") " pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:52.915975 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.915964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6509e47c-1e65-4e88-b44a-91bf5ba93351-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xsspj\" (UID: \"6509e47c-1e65-4e88-b44a-91bf5ba93351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:52.916140 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:52.916076 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76309363-7d66-4131-9509-98c2fcc90649-metrics-certs\") pod \"router-default-97bcf5c66-dbdrj\" (UID: \"76309363-7d66-4131-9509-98c2fcc90649\") " pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:53.026087 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.026061 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:03:53.026281 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.026176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" event={"ID":"bea2a2a8-4d21-4fd7-978c-0b7af8200cd7","Type":"ContainerStarted","Data":"fe12bbdcf251783bc8600dbe54d18a20177888aadc8a350b0543bd38734eccb9"} Apr 16 18:03:53.026644 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.026605 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" Apr 16 18:03:53.036600 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.036571 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" Apr 16 18:03:53.047970 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.047930 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-4t7fq" podStartSLOduration=45.762849943 podStartE2EDuration="58.047915482s" podCreationTimestamp="2026-04-16 18:02:55 +0000 UTC" firstStartedPulling="2026-04-16 18:03:23.771683897 +0000 UTC m=+35.775851884" lastFinishedPulling="2026-04-16 18:03:36.056749426 +0000 UTC m=+48.060917423" observedRunningTime="2026-04-16 18:03:53.046902882 +0000 UTC m=+65.051070888" watchObservedRunningTime="2026-04-16 18:03:53.047915482 +0000 UTC m=+65.052083489" Apr 16 18:03:53.069337 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.069315 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-s4v9v\"" Apr 16 18:03:53.077670 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.077652 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" Apr 16 18:03:53.100301 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.100282 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sc59r\"" Apr 16 18:03:53.107763 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.107745 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:53.116236 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.115638 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:53.116236 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.115692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:53.119265 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.119222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44d91f6d-be25-4f64-af47-6cdda8f2bfb6-metrics-tls\") pod \"dns-default-c6tf9\" (UID: \"44d91f6d-be25-4f64-af47-6cdda8f2bfb6\") " pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:53.120022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.119998 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85ac3502-8838-441d-985c-4dd2dc6e803c-cert\") pod \"ingress-canary-hddck\" (UID: \"85ac3502-8838-441d-985c-4dd2dc6e803c\") " 
pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:53.129842 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.129594 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dnqxl\"" Apr 16 18:03:53.136825 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.136799 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:53.162766 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.162737 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-6xbl7\"" Apr 16 18:03:53.171371 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.171344 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" Apr 16 18:03:53.238646 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.238133 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm"] Apr 16 18:03:53.241443 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:53.241396 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfadab673_8208_44dd_8dbd_4cfd3f66947b.slice/crio-7c433c15cb39e20d48578b25c038d1ad08a4a1c3335094c9bb8928afc856b0d8 WatchSource:0}: Error finding container 7c433c15cb39e20d48578b25c038d1ad08a4a1c3335094c9bb8928afc856b0d8: Status 404 returned error can't find the container with id 7c433c15cb39e20d48578b25c038d1ad08a4a1c3335094c9bb8928afc856b0d8 Apr 16 18:03:53.279547 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.279521 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-97bcf5c66-dbdrj"] Apr 16 18:03:53.282433 ip-10-0-142-167 kubenswrapper[2578]: 
W0416 18:03:53.282405 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76309363_7d66_4131_9509_98c2fcc90649.slice/crio-b25c66b779d5327d25bf59087813994fc97795270ca8682ab72318be2febe707 WatchSource:0}: Error finding container b25c66b779d5327d25bf59087813994fc97795270ca8682ab72318be2febe707: Status 404 returned error can't find the container with id b25c66b779d5327d25bf59087813994fc97795270ca8682ab72318be2febe707 Apr 16 18:03:53.287752 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.287725 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkw8n\"" Apr 16 18:03:53.295979 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.295948 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hddck" Apr 16 18:03:53.296594 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.296451 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzqks\"" Apr 16 18:03:53.303519 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.303498 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77bd68bc8b-btvvd"] Apr 16 18:03:53.303900 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.303887 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-c6tf9" Apr 16 18:03:53.311170 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:53.311136 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod635e7d14_be03_47cc_ba03_fc37558d4103.slice/crio-e19cd68d1ef10cbfe88023f24834551a3ae25b69e48d491920bb115196f3391d WatchSource:0}: Error finding container e19cd68d1ef10cbfe88023f24834551a3ae25b69e48d491920bb115196f3391d: Status 404 returned error can't find the container with id e19cd68d1ef10cbfe88023f24834551a3ae25b69e48d491920bb115196f3391d Apr 16 18:03:53.317377 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.317352 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj"] Apr 16 18:03:53.452053 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.452030 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hddck"] Apr 16 18:03:53.454257 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:53.454221 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ac3502_8838_441d_985c_4dd2dc6e803c.slice/crio-4e3876764e3ff316b8dd21f72e315d5ee9c09edf7af55ea92f85f6a8107cd71d WatchSource:0}: Error finding container 4e3876764e3ff316b8dd21f72e315d5ee9c09edf7af55ea92f85f6a8107cd71d: Status 404 returned error can't find the container with id 4e3876764e3ff316b8dd21f72e315d5ee9c09edf7af55ea92f85f6a8107cd71d Apr 16 18:03:53.469370 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:53.469276 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c6tf9"] Apr 16 18:03:53.471580 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:03:53.471555 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d91f6d_be25_4f64_af47_6cdda8f2bfb6.slice/crio-3934cfad4daa0af0a8c26ff1fd0cd8250c7ccea30adb85eabbf78a7d40d5fccc WatchSource:0}: Error finding container 3934cfad4daa0af0a8c26ff1fd0cd8250c7ccea30adb85eabbf78a7d40d5fccc: Status 404 returned error can't find the container with id 3934cfad4daa0af0a8c26ff1fd0cd8250c7ccea30adb85eabbf78a7d40d5fccc Apr 16 18:03:54.034118 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.034037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" event={"ID":"6509e47c-1e65-4e88-b44a-91bf5ba93351","Type":"ContainerStarted","Data":"a99a167cc5fbd84cbba39174df8ffe19cf321682570959384411e01a5e7743ff"} Apr 16 18:03:54.036266 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.036236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" event={"ID":"635e7d14-be03-47cc-ba03-fc37558d4103","Type":"ContainerStarted","Data":"d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805"} Apr 16 18:03:54.036378 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.036275 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" event={"ID":"635e7d14-be03-47cc-ba03-fc37558d4103","Type":"ContainerStarted","Data":"e19cd68d1ef10cbfe88023f24834551a3ae25b69e48d491920bb115196f3391d"} Apr 16 18:03:54.037025 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.036990 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:03:54.039919 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.039891 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-97bcf5c66-dbdrj" 
event={"ID":"76309363-7d66-4131-9509-98c2fcc90649","Type":"ContainerStarted","Data":"a09bc3e20e9e8b8fa500a2c1f51bcf5cfcf88bb8fdca6b8960d6688cc36c5629"} Apr 16 18:03:54.040007 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.039928 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-97bcf5c66-dbdrj" event={"ID":"76309363-7d66-4131-9509-98c2fcc90649","Type":"ContainerStarted","Data":"b25c66b779d5327d25bf59087813994fc97795270ca8682ab72318be2febe707"} Apr 16 18:03:54.042189 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.042118 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" event={"ID":"fadab673-8208-44dd-8dbd-4cfd3f66947b","Type":"ContainerStarted","Data":"7c433c15cb39e20d48578b25c038d1ad08a4a1c3335094c9bb8928afc856b0d8"} Apr 16 18:03:54.043832 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.043805 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6tf9" event={"ID":"44d91f6d-be25-4f64-af47-6cdda8f2bfb6","Type":"ContainerStarted","Data":"3934cfad4daa0af0a8c26ff1fd0cd8250c7ccea30adb85eabbf78a7d40d5fccc"} Apr 16 18:03:54.045476 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.045452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hddck" event={"ID":"85ac3502-8838-441d-985c-4dd2dc6e803c","Type":"ContainerStarted","Data":"4e3876764e3ff316b8dd21f72e315d5ee9c09edf7af55ea92f85f6a8107cd71d"} Apr 16 18:03:54.065598 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.064469 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" podStartSLOduration=65.064454124 podStartE2EDuration="1m5.064454124s" podCreationTimestamp="2026-04-16 18:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 18:03:54.063441056 +0000 UTC m=+66.067609064" watchObservedRunningTime="2026-04-16 18:03:54.064454124 +0000 UTC m=+66.068622133" Apr 16 18:03:54.108679 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.108653 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:54.111701 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.111681 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:54.138861 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.138816 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-97bcf5c66-dbdrj" podStartSLOduration=63.138800912 podStartE2EDuration="1m3.138800912s" podCreationTimestamp="2026-04-16 18:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:54.091417585 +0000 UTC m=+66.095585593" watchObservedRunningTime="2026-04-16 18:03:54.138800912 +0000 UTC m=+66.142968921" Apr 16 18:03:54.328669 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.328636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:54.331845 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.331594 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:03:54.341768 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.341706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/2f073ea3-db3b-4eaa-9a74-db58c9d97b21-metrics-certs\") pod \"network-metrics-daemon-ndzmp\" (UID: \"2f073ea3-db3b-4eaa-9a74-db58c9d97b21\") " pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:54.384744 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.384723 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qtxn\"" Apr 16 18:03:54.392112 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.392082 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndzmp" Apr 16 18:03:54.573379 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:54.573326 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ndzmp"] Apr 16 18:03:55.050612 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:55.050503 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ndzmp" event={"ID":"2f073ea3-db3b-4eaa-9a74-db58c9d97b21","Type":"ContainerStarted","Data":"9af47c1ab964fa847a5434f017d142f0c3ca95c4ddaf56e93a8e43a745d69fb4"} Apr 16 18:03:55.051108 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:55.051086 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:55.052233 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:55.052206 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-97bcf5c66-dbdrj" Apr 16 18:03:57.061901 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:57.061835 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6tf9" event={"ID":"44d91f6d-be25-4f64-af47-6cdda8f2bfb6","Type":"ContainerStarted","Data":"a8108768cb0fbe9642d8ed5aaa75d1496340ad352ca019306e1f8299ebc5711d"} Apr 16 18:03:57.071131 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:03:57.070600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hddck" event={"ID":"85ac3502-8838-441d-985c-4dd2dc6e803c","Type":"ContainerStarted","Data":"4c4222a5bb143fa30a660bdbacdfce10f18bac82fa599c351d3b233330adbbf4"} Apr 16 18:03:57.075022 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:57.074130 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" event={"ID":"fadab673-8208-44dd-8dbd-4cfd3f66947b","Type":"ContainerStarted","Data":"a33d2ae1343a3aa9c0e8579b28a998a337106e9821c06996a912b8332fb6bb14"} Apr 16 18:03:57.085064 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:57.084975 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ndzmp" event={"ID":"2f073ea3-db3b-4eaa-9a74-db58c9d97b21","Type":"ContainerStarted","Data":"cb0b20338918ff07935571142fd43636aa22e016adcb2b5fef8a2134f9481fc9"} Apr 16 18:03:57.092715 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:57.092131 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hddck" podStartSLOduration=33.723001412 podStartE2EDuration="37.092114527s" podCreationTimestamp="2026-04-16 18:03:20 +0000 UTC" firstStartedPulling="2026-04-16 18:03:53.456333068 +0000 UTC m=+65.460501060" lastFinishedPulling="2026-04-16 18:03:56.82544618 +0000 UTC m=+68.829614175" observedRunningTime="2026-04-16 18:03:57.090785523 +0000 UTC m=+69.094953531" watchObservedRunningTime="2026-04-16 18:03:57.092114527 +0000 UTC m=+69.096282535" Apr 16 18:03:57.094620 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:57.094570 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" event={"ID":"6509e47c-1e65-4e88-b44a-91bf5ba93351","Type":"ContainerStarted","Data":"d124898e60de7cb42e9b72297d53d6f47ed2e6f0c28646028e0a25f486c2027a"} Apr 
16 18:03:57.094620 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:57.094598 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" event={"ID":"6509e47c-1e65-4e88-b44a-91bf5ba93351","Type":"ContainerStarted","Data":"0504b99b331c08b96db18cde7388eedbfe4104c2c46247fd4072a49380301332"}
Apr 16 18:03:57.136570 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:57.134932 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xsspj" podStartSLOduration=59.671660457 podStartE2EDuration="1m3.134914981s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:03:53.362259959 +0000 UTC m=+65.366427946" lastFinishedPulling="2026-04-16 18:03:56.825514482 +0000 UTC m=+68.829682470" observedRunningTime="2026-04-16 18:03:57.133964926 +0000 UTC m=+69.138132934" watchObservedRunningTime="2026-04-16 18:03:57.134914981 +0000 UTC m=+69.139082988"
Apr 16 18:03:57.136570 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:57.136150 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-464rm" podStartSLOduration=62.553977028 podStartE2EDuration="1m6.136137107s" podCreationTimestamp="2026-04-16 18:02:51 +0000 UTC" firstStartedPulling="2026-04-16 18:03:53.243592811 +0000 UTC m=+65.247760800" lastFinishedPulling="2026-04-16 18:03:56.825752879 +0000 UTC m=+68.829920879" observedRunningTime="2026-04-16 18:03:57.115262965 +0000 UTC m=+69.119430973" watchObservedRunningTime="2026-04-16 18:03:57.136137107 +0000 UTC m=+69.140305115"
Apr 16 18:03:58.098581 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:58.098477 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ndzmp" event={"ID":"2f073ea3-db3b-4eaa-9a74-db58c9d97b21","Type":"ContainerStarted","Data":"3681219aff9802ffb17310a00e0c343eb258e6ae6a7e2f181c2c52e6a884002c"}
Apr 16 18:03:58.100096 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:58.100071 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6tf9" event={"ID":"44d91f6d-be25-4f64-af47-6cdda8f2bfb6","Type":"ContainerStarted","Data":"2881b6967e65ea786b7f564a202fa62db225a804e1d8f2eae9643e71a1718920"}
Apr 16 18:03:58.120630 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:58.120583 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ndzmp" podStartSLOduration=67.816839635 podStartE2EDuration="1m10.120567415s" podCreationTimestamp="2026-04-16 18:02:48 +0000 UTC" firstStartedPulling="2026-04-16 18:03:54.580903212 +0000 UTC m=+66.585071200" lastFinishedPulling="2026-04-16 18:03:56.884630995 +0000 UTC m=+68.888798980" observedRunningTime="2026-04-16 18:03:58.115727743 +0000 UTC m=+70.119895751" watchObservedRunningTime="2026-04-16 18:03:58.120567415 +0000 UTC m=+70.124735438"
Apr 16 18:03:58.134767 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:58.134721 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-c6tf9" podStartSLOduration=33.782669089 podStartE2EDuration="37.134711973s" podCreationTimestamp="2026-04-16 18:03:21 +0000 UTC" firstStartedPulling="2026-04-16 18:03:53.473434581 +0000 UTC m=+65.477602566" lastFinishedPulling="2026-04-16 18:03:56.825477457 +0000 UTC m=+68.829645450" observedRunningTime="2026-04-16 18:03:58.134360369 +0000 UTC m=+70.138528376" watchObservedRunningTime="2026-04-16 18:03:58.134711973 +0000 UTC m=+70.138879979"
Apr 16 18:03:59.103824 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:03:59.103786 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-c6tf9"
Apr 16 18:04:01.088281 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.088252 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9mttl"]
Apr 16 18:04:01.094236 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.094218 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.097471 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.097446 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nw878\""
Apr 16 18:04:01.097471 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.097460 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:04:01.098734 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.098706 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:04:01.109867 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.109844 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9mttl"]
Apr 16 18:04:01.186187 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.186141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.186314 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.186264 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-data-volume\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.186314 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.186300 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.186392 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.186324 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-crio-socket\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.186392 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.186338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ntb\" (UniqueName: \"kubernetes.io/projected/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-kube-api-access-j2ntb\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.287624 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.287594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-data-volume\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.287750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.287633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.287750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.287657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-crio-socket\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.287750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.287677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ntb\" (UniqueName: \"kubernetes.io/projected/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-kube-api-access-j2ntb\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.287750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.287719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.287887 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.287782 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-crio-socket\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.288002 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.287983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-data-volume\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.288256 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.288240 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.290217 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.290192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.297750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.297730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ntb\" (UniqueName: \"kubernetes.io/projected/ea1a3052-f4bc-4b70-a927-a2cf7dd3017a-kube-api-access-j2ntb\") pod \"insights-runtime-extractor-9mttl\" (UID: \"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a\") " pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.404465 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.404413 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9mttl"
Apr 16 18:04:01.529634 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:01.529605 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9mttl"]
Apr 16 18:04:01.533726 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:04:01.533682 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea1a3052_f4bc_4b70_a927_a2cf7dd3017a.slice/crio-6f239d719cd024ba46d672cabee935850418ce99ec574bf0eda0442217b40819 WatchSource:0}: Error finding container 6f239d719cd024ba46d672cabee935850418ce99ec574bf0eda0442217b40819: Status 404 returned error can't find the container with id 6f239d719cd024ba46d672cabee935850418ce99ec574bf0eda0442217b40819
Apr 16 18:04:02.117778 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:02.117747 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9mttl" event={"ID":"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a","Type":"ContainerStarted","Data":"db642f5e49d625418af09af4a82961c59ecc2dacfc1896b4db5f1a0bb2e2ca94"}
Apr 16 18:04:02.117778 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:02.117783 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9mttl" event={"ID":"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a","Type":"ContainerStarted","Data":"6f239d719cd024ba46d672cabee935850418ce99ec574bf0eda0442217b40819"}
Apr 16 18:04:04.123802 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:04.123772 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9mttl" event={"ID":"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a","Type":"ContainerStarted","Data":"5558241581b507e1592a36b99ef7a780f338548639766ecd28866b0c9c59d60d"}
Apr 16 18:04:06.107921 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:06.107888 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-c6tf9"
Apr 16 18:04:06.131799 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:06.131770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9mttl" event={"ID":"ea1a3052-f4bc-4b70-a927-a2cf7dd3017a","Type":"ContainerStarted","Data":"ce26a027f4fa4680a93687c00e759d8b674840444277a1fb1cce07ae712ed530"}
Apr 16 18:04:06.151257 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:06.151215 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9mttl" podStartSLOduration=0.90363975 podStartE2EDuration="5.151201336s" podCreationTimestamp="2026-04-16 18:04:01 +0000 UTC" firstStartedPulling="2026-04-16 18:04:01.692418058 +0000 UTC m=+73.696586043" lastFinishedPulling="2026-04-16 18:04:05.939979644 +0000 UTC m=+77.944147629" observedRunningTime="2026-04-16 18:04:06.150539568 +0000 UTC m=+78.154707578" watchObservedRunningTime="2026-04-16 18:04:06.151201336 +0000 UTC m=+78.155369345"
Apr 16 18:04:07.974127 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:07.974089 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6tmgb"
Apr 16 18:04:12.069324 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.069295 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-m6mr7"]
Apr 16 18:04:12.100816 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.100786 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.103835 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.103811 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:04:12.104262 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.104241 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:04:12.105099 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.105081 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-dln79\""
Apr 16 18:04:12.105213 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.105089 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:04:12.105213 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.105114 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:04:12.263421 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.263393 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-root\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.263584 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.263427 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-sys\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.263584 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.263447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.263584 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.263483 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-metrics-client-ca\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.263584 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.263506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-tls\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.263584 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.263533 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-accelerators-collector-config\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.263847 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.263624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-textfile\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.263847 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.263677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-wtmp\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.263847 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.263737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fdhd\" (UniqueName: \"kubernetes.io/projected/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-kube-api-access-6fdhd\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.364560 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-metrics-client-ca\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.364560 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-tls\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.364750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-accelerators-collector-config\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.364750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-textfile\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.364750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-wtmp\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.364750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fdhd\" (UniqueName: \"kubernetes.io/projected/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-kube-api-access-6fdhd\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.364750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-root\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.364750 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-root\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.365047 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-sys\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.365047 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364794 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.365047 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-wtmp\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.365047 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.364864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-sys\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.365285 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.365086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-textfile\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.365285 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.365184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-metrics-client-ca\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.365285 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.365242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-accelerators-collector-config\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.367045 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.367025 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-tls\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.367188 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.367152 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.373099 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.373074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fdhd\" (UniqueName: \"kubernetes.io/projected/d6d2180f-f27c-46d9-be7d-f6c5cd480f95-kube-api-access-6fdhd\") pod \"node-exporter-m6mr7\" (UID: \"d6d2180f-f27c-46d9-be7d-f6c5cd480f95\") " pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.410205 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:12.410176 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-m6mr7"
Apr 16 18:04:12.419381 ip-10-0-142-167 kubenswrapper[2578]: W0416 18:04:12.419360 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6d2180f_f27c_46d9_be7d_f6c5cd480f95.slice/crio-9a063abf4568c5a214a350bc594552d5abbf1846c6005dbb56a20005b72cd3bc WatchSource:0}: Error finding container 9a063abf4568c5a214a350bc594552d5abbf1846c6005dbb56a20005b72cd3bc: Status 404 returned error can't find the container with id 9a063abf4568c5a214a350bc594552d5abbf1846c6005dbb56a20005b72cd3bc
Apr 16 18:04:13.143945 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:13.143912 2578 patch_prober.go:28] interesting pod/image-registry-77bd68bc8b-btvvd container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 18:04:13.144308 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:13.143966 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" podUID="635e7d14-be03-47cc-ba03-fc37558d4103" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:04:13.151389 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:13.151360 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m6mr7" event={"ID":"d6d2180f-f27c-46d9-be7d-f6c5cd480f95","Type":"ContainerStarted","Data":"9a063abf4568c5a214a350bc594552d5abbf1846c6005dbb56a20005b72cd3bc"}
Apr 16 18:04:14.154855 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:14.154828 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m6mr7" event={"ID":"d6d2180f-f27c-46d9-be7d-f6c5cd480f95","Type":"ContainerStarted","Data":"0ebd71e828f68349f3ce735d8e186b17666bb2e3dae8f2fba834a4bc2d03ca61"}
Apr 16 18:04:15.159589 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:15.159553 2578 generic.go:358] "Generic (PLEG): container finished" podID="d6d2180f-f27c-46d9-be7d-f6c5cd480f95" containerID="0ebd71e828f68349f3ce735d8e186b17666bb2e3dae8f2fba834a4bc2d03ca61" exitCode=0
Apr 16 18:04:15.159966 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:15.159608 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m6mr7" event={"ID":"d6d2180f-f27c-46d9-be7d-f6c5cd480f95","Type":"ContainerDied","Data":"0ebd71e828f68349f3ce735d8e186b17666bb2e3dae8f2fba834a4bc2d03ca61"}
Apr 16 18:04:16.057602 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:16.057579 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:04:16.169309 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:16.169271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m6mr7" event={"ID":"d6d2180f-f27c-46d9-be7d-f6c5cd480f95","Type":"ContainerStarted","Data":"751ca607466b6c056618720b7e207d9e0723a83e35ef6426ef276705c3e00200"}
Apr 16 18:04:16.169309 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:16.169313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m6mr7" event={"ID":"d6d2180f-f27c-46d9-be7d-f6c5cd480f95","Type":"ContainerStarted","Data":"72b21cd38d1233ea04601dbb67a2636757944a256c9cf2fb29c52561fb597aba"}
Apr 16 18:04:16.188417 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:16.188347 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-m6mr7" podStartSLOduration=2.528457651 podStartE2EDuration="4.188330201s" podCreationTimestamp="2026-04-16 18:04:12 +0000 UTC" firstStartedPulling="2026-04-16 18:04:12.421419618 +0000 UTC m=+84.425587606" lastFinishedPulling="2026-04-16 18:04:14.081292172 +0000 UTC m=+86.085460156" observedRunningTime="2026-04-16 18:04:16.187996372 +0000 UTC m=+88.192164394" watchObservedRunningTime="2026-04-16 18:04:16.188330201 +0000 UTC m=+88.192498209"
Apr 16 18:04:23.594962 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:23.594928 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77bd68bc8b-btvvd"]
Apr 16 18:04:47.253325 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:47.253249 2578 generic.go:358] "Generic (PLEG): container finished" podID="f837e66e-5440-4461-ae9b-0bff515a395f" containerID="d6d903872a5dd754eeedbba45f4c2314c63df98dbc9a9f6f969fa0d45bd116ba" exitCode=0
Apr 16 18:04:47.253666 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:47.253324 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6" event={"ID":"f837e66e-5440-4461-ae9b-0bff515a395f","Type":"ContainerDied","Data":"d6d903872a5dd754eeedbba45f4c2314c63df98dbc9a9f6f969fa0d45bd116ba"}
Apr 16 18:04:47.253666 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:47.253596 2578 scope.go:117] "RemoveContainer" containerID="d6d903872a5dd754eeedbba45f4c2314c63df98dbc9a9f6f969fa0d45bd116ba"
Apr 16 18:04:48.257768 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.257736 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-fdmn6" event={"ID":"f837e66e-5440-4461-ae9b-0bff515a395f","Type":"ContainerStarted","Data":"24f0d5fe190d3276044ce1a6b404e761889a822f1928fb3d46d949d830f598ff"}
Apr 16 18:04:48.614094 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.614039 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" podUID="635e7d14-be03-47cc-ba03-fc37558d4103" containerName="registry" containerID="cri-o://d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805" gracePeriod=30
Apr 16 18:04:48.854864 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.854842 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd"
Apr 16 18:04:48.924370 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.924308 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") pod \"635e7d14-be03-47cc-ba03-fc37558d4103\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") "
Apr 16 18:04:48.924483 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.924374 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-image-registry-private-configuration\") pod \"635e7d14-be03-47cc-ba03-fc37558d4103\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") "
Apr 16 18:04:48.924483 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.924410 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-trusted-ca\") pod \"635e7d14-be03-47cc-ba03-fc37558d4103\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") "
Apr 16 18:04:48.924483 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.924431 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-registry-certificates\") pod \"635e7d14-be03-47cc-ba03-fc37558d4103\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") "
Apr 16 18:04:48.924483 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.924447 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t6ds\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-kube-api-access-9t6ds\") pod \"635e7d14-be03-47cc-ba03-fc37558d4103\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") "
Apr 16 18:04:48.924483 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.924465 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/635e7d14-be03-47cc-ba03-fc37558d4103-ca-trust-extracted\") pod \"635e7d14-be03-47cc-ba03-fc37558d4103\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") "
Apr 16 18:04:48.924726 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.924493 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-bound-sa-token\") pod \"635e7d14-be03-47cc-ba03-fc37558d4103\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") "
Apr 16 18:04:48.924726 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.924518 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-installation-pull-secrets\") pod \"635e7d14-be03-47cc-ba03-fc37558d4103\" (UID: \"635e7d14-be03-47cc-ba03-fc37558d4103\") "
Apr 16 18:04:48.925015 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.924979 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "635e7d14-be03-47cc-ba03-fc37558d4103" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:04:48.925015 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.925000 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "635e7d14-be03-47cc-ba03-fc37558d4103" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:04:48.926774 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.926746 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "635e7d14-be03-47cc-ba03-fc37558d4103" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:04:48.926923 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.926899 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "635e7d14-be03-47cc-ba03-fc37558d4103" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:04:48.927320 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.926986 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "635e7d14-be03-47cc-ba03-fc37558d4103" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:04:48.927320 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.927186 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-kube-api-access-9t6ds" (OuterVolumeSpecName: "kube-api-access-9t6ds") pod "635e7d14-be03-47cc-ba03-fc37558d4103" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103"). InnerVolumeSpecName "kube-api-access-9t6ds". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:04:48.927320 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.927302 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "635e7d14-be03-47cc-ba03-fc37558d4103" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:04:48.933399 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:48.933374 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/635e7d14-be03-47cc-ba03-fc37558d4103-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "635e7d14-be03-47cc-ba03-fc37558d4103" (UID: "635e7d14-be03-47cc-ba03-fc37558d4103"). InnerVolumeSpecName "ca-trust-extracted".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:49.025733 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.025714 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-trusted-ca\") on node \"ip-10-0-142-167.ec2.internal\" DevicePath \"\"" Apr 16 18:04:49.025814 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.025734 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/635e7d14-be03-47cc-ba03-fc37558d4103-registry-certificates\") on node \"ip-10-0-142-167.ec2.internal\" DevicePath \"\"" Apr 16 18:04:49.025814 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.025746 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9t6ds\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-kube-api-access-9t6ds\") on node \"ip-10-0-142-167.ec2.internal\" DevicePath \"\"" Apr 16 18:04:49.025814 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.025756 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/635e7d14-be03-47cc-ba03-fc37558d4103-ca-trust-extracted\") on node \"ip-10-0-142-167.ec2.internal\" DevicePath \"\"" Apr 16 18:04:49.025814 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.025765 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-bound-sa-token\") on node \"ip-10-0-142-167.ec2.internal\" DevicePath \"\"" Apr 16 18:04:49.025814 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.025774 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-installation-pull-secrets\") on node \"ip-10-0-142-167.ec2.internal\" DevicePath \"\"" Apr 16 18:04:49.025814 
ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.025782 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635e7d14-be03-47cc-ba03-fc37558d4103-registry-tls\") on node \"ip-10-0-142-167.ec2.internal\" DevicePath \"\"" Apr 16 18:04:49.025814 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.025792 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/635e7d14-be03-47cc-ba03-fc37558d4103-image-registry-private-configuration\") on node \"ip-10-0-142-167.ec2.internal\" DevicePath \"\"" Apr 16 18:04:49.261116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.261040 2578 generic.go:358] "Generic (PLEG): container finished" podID="635e7d14-be03-47cc-ba03-fc37558d4103" containerID="d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805" exitCode=0 Apr 16 18:04:49.261116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.261085 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" event={"ID":"635e7d14-be03-47cc-ba03-fc37558d4103","Type":"ContainerDied","Data":"d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805"} Apr 16 18:04:49.261116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.261096 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" Apr 16 18:04:49.261116 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.261108 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77bd68bc8b-btvvd" event={"ID":"635e7d14-be03-47cc-ba03-fc37558d4103","Type":"ContainerDied","Data":"e19cd68d1ef10cbfe88023f24834551a3ae25b69e48d491920bb115196f3391d"} Apr 16 18:04:49.261627 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.261124 2578 scope.go:117] "RemoveContainer" containerID="d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805" Apr 16 18:04:49.268861 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.268843 2578 scope.go:117] "RemoveContainer" containerID="d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805" Apr 16 18:04:49.269119 ip-10-0-142-167 kubenswrapper[2578]: E0416 18:04:49.269099 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805\": container with ID starting with d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805 not found: ID does not exist" containerID="d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805" Apr 16 18:04:49.269225 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.269125 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805"} err="failed to get container status \"d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805\": rpc error: code = NotFound desc = could not find container \"d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805\": container with ID starting with d8904f2f2dfedb89eecf5e14dfae9144889834216ae8794a3df705f55676f805 not found: ID does not exist" Apr 16 18:04:49.282597 ip-10-0-142-167 kubenswrapper[2578]: I0416 
18:04:49.282578 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77bd68bc8b-btvvd"] Apr 16 18:04:49.286638 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:49.286618 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-77bd68bc8b-btvvd"] Apr 16 18:04:50.561249 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:50.561218 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="635e7d14-be03-47cc-ba03-fc37558d4103" path="/var/lib/kubelet/pods/635e7d14-be03-47cc-ba03-fc37558d4103/volumes" Apr 16 18:04:53.274416 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:53.274384 2578 generic.go:358] "Generic (PLEG): container finished" podID="e5b3a3bd-16e5-4438-837b-7f24def37fc3" containerID="1ca5728ca0413ca6a4b0cec69064cac6465b1a4178170db346368cae6cfbd220" exitCode=0 Apr 16 18:04:53.274771 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:53.274428 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" event={"ID":"e5b3a3bd-16e5-4438-837b-7f24def37fc3","Type":"ContainerDied","Data":"1ca5728ca0413ca6a4b0cec69064cac6465b1a4178170db346368cae6cfbd220"} Apr 16 18:04:53.274771 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:53.274691 2578 scope.go:117] "RemoveContainer" containerID="1ca5728ca0413ca6a4b0cec69064cac6465b1a4178170db346368cae6cfbd220" Apr 16 18:04:54.278835 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:54.278797 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-cmsdt" event={"ID":"e5b3a3bd-16e5-4438-837b-7f24def37fc3","Type":"ContainerStarted","Data":"bec4d6daa24b12420b00ac8a19036adb2bff13789573c0cd258b678e24bb064b"} Apr 16 18:04:56.236189 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:56.236146 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-464rm_fadab673-8208-44dd-8dbd-4cfd3f66947b/cluster-monitoring-operator/0.log" Apr 16 18:04:58.010854 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:58.010826 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m6mr7_d6d2180f-f27c-46d9-be7d-f6c5cd480f95/init-textfile/0.log" Apr 16 18:04:58.211442 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:58.211413 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m6mr7_d6d2180f-f27c-46d9-be7d-f6c5cd480f95/node-exporter/0.log" Apr 16 18:04:58.412010 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:04:58.411982 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m6mr7_d6d2180f-f27c-46d9-be7d-f6c5cd480f95/kube-rbac-proxy/0.log" Apr 16 18:05:01.453528 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:01.453491 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" podUID="d8511e45-12bd-403d-adca-66af780a5704" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:05:03.210478 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:03.210450 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:05:03.413606 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:03.413575 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/2.log" Apr 16 18:05:04.010469 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:04.010442 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-97bcf5c66-dbdrj_76309363-7d66-4131-9509-98c2fcc90649/router/0.log" Apr 16 18:05:04.615300 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:04.615272 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hddck_85ac3502-8838-441d-985c-4dd2dc6e803c/serve-healthcheck-canary/0.log" Apr 16 18:05:08.320087 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:08.320056 2578 generic.go:358] "Generic (PLEG): container finished" podID="4a741170-26d5-4d55-bd37-3e869323218c" containerID="1c1d9454872a5cfa99a5c732469b542f5702c960eb9cba9d70f4fc1de211f3d7" exitCode=0 Apr 16 18:05:08.320465 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:08.320110 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg" event={"ID":"4a741170-26d5-4d55-bd37-3e869323218c","Type":"ContainerDied","Data":"1c1d9454872a5cfa99a5c732469b542f5702c960eb9cba9d70f4fc1de211f3d7"} Apr 16 18:05:08.320465 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:08.320371 2578 scope.go:117] "RemoveContainer" containerID="1c1d9454872a5cfa99a5c732469b542f5702c960eb9cba9d70f4fc1de211f3d7" Apr 16 18:05:09.325614 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:09.325577 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nqmqg" event={"ID":"4a741170-26d5-4d55-bd37-3e869323218c","Type":"ContainerStarted","Data":"ee1938d9c9a36f373c3c1c8a0242b542d950ee02c60fc7fb72dfe9399d8bede3"} Apr 16 18:05:11.453437 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:11.453400 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" podUID="d8511e45-12bd-403d-adca-66af780a5704" containerName="service-proxy" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Apr 16 18:05:21.453229 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:21.453191 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" podUID="d8511e45-12bd-403d-adca-66af780a5704" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:05:21.453666 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:21.453263 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" Apr 16 18:05:21.453772 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:21.453753 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"590674fe0cdef4b48b99996aa73c87644b6286685b64b425c978730fc9cbce20"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 18:05:21.453808 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:21.453796 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" podUID="d8511e45-12bd-403d-adca-66af780a5704" containerName="service-proxy" containerID="cri-o://590674fe0cdef4b48b99996aa73c87644b6286685b64b425c978730fc9cbce20" gracePeriod=30 Apr 16 18:05:22.363269 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:22.363238 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8511e45-12bd-403d-adca-66af780a5704" containerID="590674fe0cdef4b48b99996aa73c87644b6286685b64b425c978730fc9cbce20" exitCode=2 Apr 16 18:05:22.363432 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:22.363286 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" 
event={"ID":"d8511e45-12bd-403d-adca-66af780a5704","Type":"ContainerDied","Data":"590674fe0cdef4b48b99996aa73c87644b6286685b64b425c978730fc9cbce20"} Apr 16 18:05:22.363432 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:05:22.363319 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c998fdf48-zdgvs" event={"ID":"d8511e45-12bd-403d-adca-66af780a5704","Type":"ContainerStarted","Data":"3dc7b1cec8ab220cac7e4fcb595499b94619631555f4e4065ac03fd03d744d7c"} Apr 16 18:07:48.473132 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:07:48.473100 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:07:48.473920 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:07:48.473897 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:07:48.478974 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:07:48.478952 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:07:48.479911 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:07:48.479887 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:07:48.484964 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:07:48.484948 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:12:48.493198 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:12:48.493170 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:12:48.493791 
ip-10-0-142-167 kubenswrapper[2578]: I0416 18:12:48.493772 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:12:48.498850 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:12:48.498828 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:12:48.499698 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:12:48.499675 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:17:48.517606 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:17:48.517573 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:17:48.519394 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:17:48.519374 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:17:48.523074 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:17:48.523056 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:17:48.524614 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:17:48.524595 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:22:48.535890 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:22:48.535859 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:22:48.538537 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:22:48.538517 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:22:48.541438 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:22:48.541418 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:22:48.546783 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:22:48.546766 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:27:48.557065 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:27:48.557034 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:27:48.562781 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:27:48.562756 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:27:48.563636 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:27:48.563604 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:27:48.573756 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:27:48.573736 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:32:48.581353 ip-10-0-142-167 
kubenswrapper[2578]: I0416 18:32:48.581327 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:32:48.586654 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:32:48.586633 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:32:48.586654 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:32:48.586643 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:32:48.592457 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:32:48.592440 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:37:48.599636 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:37:48.599603 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:37:48.605229 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:37:48.605207 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:37:48.605362 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:37:48.605343 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:37:48.610682 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:37:48.610664 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:42:48.618074 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:42:48.618047 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:42:48.625723 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:42:48.625703 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:42:48.630180 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:42:48.630146 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:42:48.635495 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:42:48.635477 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:47:48.638775 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:47:48.638706 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:47:48.647104 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:47:48.647080 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:47:48.652277 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:47:48.652257 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:47:48.657337 ip-10-0-142-167 
kubenswrapper[2578]: I0416 18:47:48.657320 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:52:48.660230 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:52:48.660201 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:52:48.666082 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:52:48.666062 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:52:48.675632 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:52:48.675609 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:52:48.681093 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:52:48.681076 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:57:48.680093 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:57:48.680068 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log" Apr 16 18:57:48.685404 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:57:48.685384 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log" Apr 16 18:57:48.694065 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:57:48.694045 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log"
Apr 16 18:57:48.701740 ip-10-0-142-167 kubenswrapper[2578]: I0416 18:57:48.701722 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log"
Apr 16 19:02:48.698388 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:02:48.698361 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log"
Apr 16 19:02:48.703754 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:02:48.703736 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log"
Apr 16 19:02:48.714837 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:02:48.714814 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log"
Apr 16 19:02:48.720102 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:02:48.720087 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log"
Apr 16 19:07:26.077720 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:26.077688 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-28sv4_6fd513cc-2c53-4020-94b3-faf51a11b03f/global-pull-secret-syncer/0.log"
Apr 16 19:07:26.347282 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:26.347214 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zlrpl_47f30843-f6cf-4b8f-97fc-e1c1b5c83d63/konnectivity-agent/0.log"
Apr 16 19:07:26.430619 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:26.430592 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-167.ec2.internal_f3d3ab7bc151663feafe988605589632/haproxy/0.log"
Apr 16 19:07:29.944954 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:29.944886 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-464rm_fadab673-8208-44dd-8dbd-4cfd3f66947b/cluster-monitoring-operator/0.log"
Apr 16 19:07:30.332562 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:30.332540 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m6mr7_d6d2180f-f27c-46d9-be7d-f6c5cd480f95/node-exporter/0.log"
Apr 16 19:07:30.359127 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:30.359107 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m6mr7_d6d2180f-f27c-46d9-be7d-f6c5cd480f95/kube-rbac-proxy/0.log"
Apr 16 19:07:30.385614 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:30.385598 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m6mr7_d6d2180f-f27c-46d9-be7d-f6c5cd480f95/init-textfile/0.log"
Apr 16 19:07:32.781747 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:32.781718 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log"
Apr 16 19:07:32.786799 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:32.786775 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/2.log"
Apr 16 19:07:33.351538 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.351509 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"]
Apr 16 19:07:33.351842 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.351826 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="635e7d14-be03-47cc-ba03-fc37558d4103" containerName="registry"
Apr 16 19:07:33.351918 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.351844 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="635e7d14-be03-47cc-ba03-fc37558d4103" containerName="registry"
Apr 16 19:07:33.351918 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.351913 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="635e7d14-be03-47cc-ba03-fc37558d4103" containerName="registry"
Apr 16 19:07:33.354484 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.354463 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.357197 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.357179 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s228v\"/\"openshift-service-ca.crt\""
Apr 16 19:07:33.357399 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.357387 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-s228v\"/\"default-dockercfg-8h7sl\""
Apr 16 19:07:33.358423 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.358405 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s228v\"/\"kube-root-ca.crt\""
Apr 16 19:07:33.366302 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.366283 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"]
Apr 16 19:07:33.418027 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.418004 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-proc\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.418127 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.418032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-lib-modules\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.418127 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.418060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs9mx\" (UniqueName: \"kubernetes.io/projected/c74dd666-bc1f-4880-bcc8-25fda4191137-kube-api-access-bs9mx\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.418226 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.418185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-podres\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.418226 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.418213 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-sys\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.519404 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.519381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-proc\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.519492 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.519410 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-lib-modules\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.519492 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.519444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs9mx\" (UniqueName: \"kubernetes.io/projected/c74dd666-bc1f-4880-bcc8-25fda4191137-kube-api-access-bs9mx\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.519492 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.519482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-podres\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.519604 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.519507 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-proc\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.519604 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.519512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-sys\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.519604 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.519549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-lib-modules\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.519604 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.519600 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-podres\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.519767 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.519661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c74dd666-bc1f-4880-bcc8-25fda4191137-sys\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.528333 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.528305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs9mx\" (UniqueName: \"kubernetes.io/projected/c74dd666-bc1f-4880-bcc8-25fda4191137-kube-api-access-bs9mx\") pod \"perf-node-gather-daemonset-r9m8m\" (UID: \"c74dd666-bc1f-4880-bcc8-25fda4191137\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.657854 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.657798 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-xc7ff_9dd185dc-97ca-4edf-a263-a8115564eb69/volume-data-source-validator/0.log"
Apr 16 19:07:33.663896 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.663877 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:33.783334 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.783313 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"]
Apr 16 19:07:33.785608 ip-10-0-142-167 kubenswrapper[2578]: W0416 19:07:33.785580 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc74dd666_bc1f_4880_bcc8_25fda4191137.slice/crio-617d28441cfa32f16353195930a555f57e105f498377a990f231ae5d5dc32c3d WatchSource:0}: Error finding container 617d28441cfa32f16353195930a555f57e105f498377a990f231ae5d5dc32c3d: Status 404 returned error can't find the container with id 617d28441cfa32f16353195930a555f57e105f498377a990f231ae5d5dc32c3d
Apr 16 19:07:33.787289 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:33.787274 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:07:34.422252 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:34.422227 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-c6tf9_44d91f6d-be25-4f64-af47-6cdda8f2bfb6/dns/0.log"
Apr 16 19:07:34.446538 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:34.446518 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-c6tf9_44d91f6d-be25-4f64-af47-6cdda8f2bfb6/kube-rbac-proxy/0.log"
Apr 16 19:07:34.550467 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:34.550441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m" event={"ID":"c74dd666-bc1f-4880-bcc8-25fda4191137","Type":"ContainerStarted","Data":"be960738963d0b1d772d369044481b873ad7ca831f2b0831b2e2c7a82350215e"}
Apr 16 19:07:34.550582 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:34.550471 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m" event={"ID":"c74dd666-bc1f-4880-bcc8-25fda4191137","Type":"ContainerStarted","Data":"617d28441cfa32f16353195930a555f57e105f498377a990f231ae5d5dc32c3d"}
Apr 16 19:07:34.550582 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:34.550548 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:34.561869 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:34.561847 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qnnb2_6497ce92-67e0-497d-b2db-ccc3571b7753/dns-node-resolver/0.log"
Apr 16 19:07:34.584651 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:34.584620 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m" podStartSLOduration=1.584608055 podStartE2EDuration="1.584608055s" podCreationTimestamp="2026-04-16 19:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:07:34.582614767 +0000 UTC m=+3886.586782775" watchObservedRunningTime="2026-04-16 19:07:34.584608055 +0000 UTC m=+3886.588776062"
Apr 16 19:07:35.016653 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:35.016631 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dmn8s_60136db5-eb71-48af-b059-62d18f47a211/node-ca/0.log"
Apr 16 19:07:35.798349 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:35.798317 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-97bcf5c66-dbdrj_76309363-7d66-4131-9509-98c2fcc90649/router/0.log"
Apr 16 19:07:36.239124 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:36.239095 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hddck_85ac3502-8838-441d-985c-4dd2dc6e803c/serve-healthcheck-canary/0.log"
Apr 16 19:07:36.617333 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:36.617306 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-cmsdt_e5b3a3bd-16e5-4438-837b-7f24def37fc3/insights-operator/0.log"
Apr 16 19:07:36.619368 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:36.619350 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-cmsdt_e5b3a3bd-16e5-4438-837b-7f24def37fc3/insights-operator/1.log"
Apr 16 19:07:36.720892 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:36.720870 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9mttl_ea1a3052-f4bc-4b70-a927-a2cf7dd3017a/kube-rbac-proxy/0.log"
Apr 16 19:07:36.747296 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:36.747276 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9mttl_ea1a3052-f4bc-4b70-a927-a2cf7dd3017a/exporter/0.log"
Apr 16 19:07:36.772978 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:36.772960 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9mttl_ea1a3052-f4bc-4b70-a927-a2cf7dd3017a/extractor/0.log"
Apr 16 19:07:40.564013 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:40.563973 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-r9m8m"
Apr 16 19:07:44.160994 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:44.160969 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-nqmqg_4a741170-26d5-4d55-bd37-3e869323218c/kube-storage-version-migrator-operator/1.log"
Apr 16 19:07:44.161824 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:44.161806 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-nqmqg_4a741170-26d5-4d55-bd37-3e869323218c/kube-storage-version-migrator-operator/0.log"
Apr 16 19:07:45.106282 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.106255 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7lp9m_8c8eed63-c467-428b-aa99-b72e120b58e9/kube-multus/0.log"
Apr 16 19:07:45.334063 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.334038 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6d8n_3ee28cfd-b76c-488a-8374-405ee3a9a635/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:07:45.360512 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.360410 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6d8n_3ee28cfd-b76c-488a-8374-405ee3a9a635/egress-router-binary-copy/0.log"
Apr 16 19:07:45.386883 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.386862 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6d8n_3ee28cfd-b76c-488a-8374-405ee3a9a635/cni-plugins/0.log"
Apr 16 19:07:45.411038 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.411019 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6d8n_3ee28cfd-b76c-488a-8374-405ee3a9a635/bond-cni-plugin/0.log"
Apr 16 19:07:45.435225 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.435209 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6d8n_3ee28cfd-b76c-488a-8374-405ee3a9a635/routeoverride-cni/0.log"
Apr 16 19:07:45.461011 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.460990 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6d8n_3ee28cfd-b76c-488a-8374-405ee3a9a635/whereabouts-cni-bincopy/0.log"
Apr 16 19:07:45.486345 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.486326 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6d8n_3ee28cfd-b76c-488a-8374-405ee3a9a635/whereabouts-cni/0.log"
Apr 16 19:07:45.900938 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.900917 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ndzmp_2f073ea3-db3b-4eaa-9a74-db58c9d97b21/network-metrics-daemon/0.log"
Apr 16 19:07:45.926630 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:45.926610 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ndzmp_2f073ea3-db3b-4eaa-9a74-db58c9d97b21/kube-rbac-proxy/0.log"
Apr 16 19:07:47.204182 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:47.204147 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-controller/0.log"
Apr 16 19:07:47.227474 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:47.227452 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log"
Apr 16 19:07:47.243928 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:47.243906 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/1.log"
Apr 16 19:07:47.270127 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:47.270102 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/kube-rbac-proxy-node/0.log"
Apr 16 19:07:47.299597 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:47.299578 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:07:47.329712 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:47.329694 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/northd/0.log"
Apr 16 19:07:47.357054 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:47.357031 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/nbdb/0.log"
Apr 16 19:07:47.387566 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:47.387550 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/sbdb/0.log"
Apr 16 19:07:47.485466 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:47.485417 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovnkube-controller/0.log"
Apr 16 19:07:48.716359 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:48.716330 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log"
Apr 16 19:07:48.721690 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:48.721669 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log"
Apr 16 19:07:48.733760 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:48.733737 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4t7fq_bea2a2a8-4d21-4fd7-978c-0b7af8200cd7/console-operator/1.log"
Apr 16 19:07:48.738926 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:48.738910 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsvsh_b3ae8fd0-5875-4adc-8994-9c74852c6397/ovn-acl-logging/0.log"
Apr 16 19:07:48.746979 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:48.746963 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-k687t_1581c4ae-cdee-4c3c-8bb7-c74cf465de66/check-endpoints/0.log"
Apr 16 19:07:48.794860 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:48.794837 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6tmgb_a336e08f-92e1-4f5f-99d6-9f8231b01727/network-check-target-container/0.log"
Apr 16 19:07:49.789683 ip-10-0-142-167 kubenswrapper[2578]: I0416 19:07:49.789659 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2bz8t_a0b63c55-85f3-4126-9cbf-dac101325a0b/iptables-alerter/0.log"