Apr 24 14:21:52.463871 ip-10-0-134-82 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 14:21:52.463884 ip-10-0-134-82 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 14:21:52.463896 ip-10-0-134-82 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 14:21:52.464232 ip-10-0-134-82 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 14:22:02.610093 ip-10-0-134-82 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 14:22:02.610111 ip-10-0-134-82 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0bd2a27e6c514f1a964ca7be8dc60bb3 --
Apr 24 14:24:36.620183 ip-10-0-134-82 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 14:24:37.078767 ip-10-0-134-82 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:37.078767 ip-10-0-134-82 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 14:24:37.078767 ip-10-0-134-82 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:37.078767 ip-10-0-134-82 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 14:24:37.078767 ip-10-0-134-82 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:37.079600 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.079509 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 14:24:37.083553 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083536 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:37.083553 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083553 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083557 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083560 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083563 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083568 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083573 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083577 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083580 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083582 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083585 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083588 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083591 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083593 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083596 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083599 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083602 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083604 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083608 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083612 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:37.083628 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083615 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083618 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083621 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083624 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083627 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083630 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083632 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083636 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083639 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083642 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083644 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083647 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083650 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083652 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083655 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083658 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083660 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083663 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083665 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:37.084150 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083668 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083671 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083674 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083676 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083679 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083681 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083684 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083686 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083689 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083692 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083694 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083697 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083699 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083702 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083704 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083708 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083711 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083714 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083717 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:37.084658 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083719 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083723 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083726 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083728 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083731 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083734 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083736 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083739 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083742 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083744 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083747 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083749 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083752 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083755 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083758 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083761 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083766 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083770 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083774 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083777 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:37.085145 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083780 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083782 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083785 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083788 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083791 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083794 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083797 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.083799 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084225 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084233 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084236 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084240 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084243 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084246 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084249 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084252 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084254 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084257 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084260 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084262 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:37.085626 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084266 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084268 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084272 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084275 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084279 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084282 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084284 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084287 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084289 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084291 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084294 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084296 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084300 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084303 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084305 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084309 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084312 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084314 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084317 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:37.086142 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084320 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084323 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084326 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084328 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084331 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084333 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084336 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084339 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084341 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084344 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084347 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084349 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084352 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084354 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084358 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084360 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084362 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084365 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084367 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084370 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:37.086663 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084372 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084375 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084377 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084380 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084382 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084385 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084388 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084390 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084393 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084395 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084398 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084400 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084403 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084406 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084408 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084411 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084414 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084417 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084420 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084422 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:37.087187 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084425 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084427 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084430 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084432 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084435 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084437 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084440 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084443 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084445 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084448 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084450 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084452 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084455 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084457 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.084460 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084532 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084541 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084547 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084552 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084557 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084560 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 14:24:37.087701 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084565 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084569 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084572 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084575 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084579 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084583 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084586 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084589 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084592 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084595 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084598 2577 flags.go:64] FLAG: --cloud-config=""
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084601 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084604 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084608 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084612 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084615 2577 flags.go:64] FLAG: --config-dir=""
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084617 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084621 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084625 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084629 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084632 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084635 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084638 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084641 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 14:24:37.088239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084643 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084646 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084649 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084653 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084657 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084660 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084663 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084667 2577 flags.go:64] FLAG: --enable-server="true"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084670 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084674 2577 flags.go:64] FLAG: --event-burst="100"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084678 2577 flags.go:64] FLAG: --event-qps="50"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084681 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084688 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084691 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084696 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084699 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084702 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084705 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084708 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084711 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084714 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084717 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084720 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084723 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084726 2577 flags.go:64] FLAG: --feature-gates=""
Apr 24 14:24:37.088827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084729 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084732 2577 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084735 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084739 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084743 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084746 2577 flags.go:64] FLAG: --help="false" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084748 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084752 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084755 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084758 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084761 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084764 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084767 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084771 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084774 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 14:24:37.089465 ip-10-0-134-82 
kubenswrapper[2577]: I0424 14:24:37.084777 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084780 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084783 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084785 2577 flags.go:64] FLAG: --kube-reserved="" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084790 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084793 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084796 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084799 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084802 2577 flags.go:64] FLAG: --lock-file="" Apr 24 14:24:37.089465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084804 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084807 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084810 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084816 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084819 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084822 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 
14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084827 2577 flags.go:64] FLAG: --logging-format="text" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084830 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084833 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084836 2577 flags.go:64] FLAG: --manifest-url="" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084839 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084844 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084848 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084852 2577 flags.go:64] FLAG: --max-pods="110" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084855 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084858 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084860 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084878 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084882 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084885 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084888 2577 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084895 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084898 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084901 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 14:24:37.090090 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084904 2577 flags.go:64] FLAG: --pod-cidr="" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084907 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084913 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084916 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084921 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084924 2577 flags.go:64] FLAG: --port="10250" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084927 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084930 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02ba22d52e46f602a" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084933 2577 flags.go:64] FLAG: --qos-reserved="" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084936 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 
14:24:37.084939 2577 flags.go:64] FLAG: --register-node="true" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084942 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084945 2577 flags.go:64] FLAG: --register-with-taints="" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084949 2577 flags.go:64] FLAG: --registry-burst="10" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084958 2577 flags.go:64] FLAG: --registry-qps="5" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084961 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084964 2577 flags.go:64] FLAG: --reserved-memory="" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084967 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084970 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084973 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084976 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084980 2577 flags.go:64] FLAG: --runonce="false" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084983 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084986 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084989 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 24 14:24:37.090681 ip-10-0-134-82 kubenswrapper[2577]: I0424 
14:24:37.084993 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084996 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.084999 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085002 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085005 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085008 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085011 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085014 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085017 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085020 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085022 2577 flags.go:64] FLAG: --system-cgroups="" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085027 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085032 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085035 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085038 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" 
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085042 2577 flags.go:64] FLAG: --tls-min-version=""
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085045 2577 flags.go:64] FLAG: --tls-private-key-file=""
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085048 2577 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085051 2577 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085054 2577 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085057 2577 flags.go:64] FLAG: --v="2"
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085063 2577 flags.go:64] FLAG: --version="false"
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085067 2577 flags.go:64] FLAG: --vmodule=""
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085071 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 24 14:24:37.091322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.085075 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085190 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085195 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085199 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085202 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085210 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085213 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085216 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085219 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085221 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085224 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085226 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085229 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085232 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085235 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085237 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085240 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085242 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085245 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:37.091986 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085249 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085251 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085254 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085257 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085259 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085262 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085264 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085267 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085269 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085272 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085276 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085278 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085281 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085283 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085286 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085289 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085291 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085294 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085296 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085304 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:37.092487 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085307 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085310 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085312 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085315 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085318 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085320 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085323 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085325 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085328 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085331 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085333 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085336 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085340 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085342 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085345 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085348 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085350 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085353 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085355 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085358 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:37.093013 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085360 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085363 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085367 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085370 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085372 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085375 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085378 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085380 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085383 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085385 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085388 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085391 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085400 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085403 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085406 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085410 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085413 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085417 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085420 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:37.093560 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085428 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085431 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085434 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085436 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085439 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085443 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085446 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085449 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.085452 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.086511 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.093082 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.093201 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093261 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093267 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093270 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:37.094073 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093273 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093276 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093279 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093282 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093285 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093287 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093290 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093293 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093296 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093300 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093305 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093309 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093312 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093315 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093318 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093322 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093324 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093327 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093330 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:37.094457 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093332 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093335 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093338 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093340 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093343 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093346 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093348 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093350 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093353 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093357 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:37.095028 ip-10-0-134-82
kubenswrapper[2577]: W0424 14:24:37.093360 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093363 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093366 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093369 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093371 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093374 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093378 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093382 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093384 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:37.095028 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093387 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093389 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093392 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093395 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093397 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093400 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093403 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093406 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093408 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093410 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093413 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093416 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093418 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093421 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093423 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093426 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093428 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093431 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093434 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093436 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:37.095488 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093439 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093441 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093445 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093448 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093450 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093453 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093456 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093458 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093461 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093463 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093466 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093468 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093471 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093473 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093476 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093478 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093481 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093483 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093486 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093489 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:37.095983 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093492 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093494 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093497 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093499 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093502 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.093507 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093605 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093610 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093613 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093615 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093618 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093621 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093623 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093626 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093630 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:37.096506 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093633 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093635 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093638 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093641 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093643 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093646 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093649 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093651 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093654 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093657 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093659 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093662 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093664 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093667 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093670 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093672 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093675 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093678 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093680 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:37.096942 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093683 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093686 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093688 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093690 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093693 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093695 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093698 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093700 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093702 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093705 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093709 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093712 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093715 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093718 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093721 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093724 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093727 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093729 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093732 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:37.097403 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093734 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093737 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093741 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093744 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093747 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093749 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093752 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093754 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093757 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093759 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093762 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093765 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093768 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093770 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093773 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093775 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093777 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093780 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093782 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093785 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:37.098020 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093787 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093790 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093792 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093795 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093797 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093800 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093803 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093806 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093808 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093811 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093814 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093816 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093819 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093821 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093824 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093826 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093829 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093831 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:37.098585 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:37.093834 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:37.099097 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.093839 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:37.099097 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.094565 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 14:24:37.099097 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.096526 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 14:24:37.099097 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.097477 2577 server.go:1019] "Starting client certificate rotation"
Apr 24 14:24:37.099097 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.097595 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:24:37.099097 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.097650 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:24:37.123676 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.123651 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:24:37.130726 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.130701 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:24:37.146015 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.145995 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 24 14:24:37.152092 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.152073 2577 log.go:25] "Validated CRI v1 image API"
Apr 24 14:24:37.153500 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.153478 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 14:24:37.159839 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.159816 2577 fs.go:135] Filesystem UUIDs: map[4e714b6b-4532-494b-95bd-adbdaf8d756e:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c9f78a7b-a3e2-409d-a508-2c1f7d5d203a:/dev/nvme0n1p3]
Apr 24 14:24:37.159946 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.159838 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 14:24:37.161081 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.161060 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:24:37.165898 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.165769 2577 manager.go:217] Machine: {Timestamp:2026-04-24 14:24:37.163854981 +0000 UTC m=+0.421756998 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092951 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b9a27f0a7ee0e5ab6551d811ff516 SystemUUID:ec2b9a27-f0a7-ee0e-5ab6-551d811ff516 BootID:0bd2a27e-6c51-4f1a-964c-a7be8dc60bb3 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:06:53:68:65:2f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:06:53:68:65:2f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:6c:b7:94:85:9f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 14:24:37.165898 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.165893 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 14:24:37.166040 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.166028 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 14:24:37.167196 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.167169 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 14:24:37.167347 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.167199 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-82.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 14:24:37.167394 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.167356 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 14:24:37.167394 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.167365 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 14:24:37.167394 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.167378 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:24:37.168981 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.168971 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:24:37.170415 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.170405 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:24:37.170534 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.170525 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 14:24:37.173041 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.173031 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 24 14:24:37.173076 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.173048 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 14:24:37.173076 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.173069 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 14:24:37.173148 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.173079 2577 kubelet.go:397] "Adding apiserver pod source" Apr 24 14:24:37.173148 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.173093 2577 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 14:24:37.174244 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.174230 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:24:37.174244 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.174248 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:24:37.178728 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.178707 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 14:24:37.180093 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.180080 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 14:24:37.181914 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181901 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 14:24:37.181968 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181922 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 14:24:37.181968 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181932 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 14:24:37.181968 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181941 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 14:24:37.181968 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181949 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 14:24:37.181968 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181957 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 14:24:37.181968 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181966 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 
14:24:37.182124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181974 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 14:24:37.182124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181984 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 14:24:37.182124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.181993 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 14:24:37.182124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.182016 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 14:24:37.182124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.182029 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 14:24:37.182988 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.182976 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 14:24:37.183032 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.182990 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 14:24:37.183212 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.183195 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5nbb2" Apr 24 14:24:37.184559 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.184521 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-82.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 14:24:37.184642 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.184623 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" 
at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 14:24:37.186881 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.186855 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 14:24:37.186939 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.186906 2577 server.go:1295] "Started kubelet" Apr 24 14:24:37.187005 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.186982 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 14:24:37.187096 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.187061 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 14:24:37.187145 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.187122 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 14:24:37.187827 ip-10-0-134-82 systemd[1]: Started Kubernetes Kubelet. Apr 24 14:24:37.188448 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.188232 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 14:24:37.189918 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.189900 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 24 14:24:37.191067 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.191050 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5nbb2" Apr 24 14:24:37.192850 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.192832 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-82.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 14:24:37.195629 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.195608 2577 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kubelet-serving" Apr 24 14:24:37.196227 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.196210 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 14:24:37.196903 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.196887 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 14:24:37.197105 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197085 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 14:24:37.197186 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197109 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 14:24:37.197260 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197248 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 24 14:24:37.197311 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197262 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 24 14:24:37.197414 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.197386 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found" Apr 24 14:24:37.197501 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197477 2577 factory.go:55] Registering systemd factory Apr 24 14:24:37.197501 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197497 2577 factory.go:223] Registration of the systemd container factory successfully Apr 24 14:24:37.197724 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197706 2577 factory.go:153] Registering CRI-O factory Apr 24 14:24:37.197724 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197720 2577 factory.go:223] Registration of the crio container factory successfully Apr 24 14:24:37.197923 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197769 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix 
/run/containerd/containerd.sock: connect: no such file or directory Apr 24 14:24:37.197923 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197795 2577 factory.go:103] Registering Raw factory Apr 24 14:24:37.197923 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.197807 2577 manager.go:1196] Started watching for new ooms in manager Apr 24 14:24:37.198243 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.198217 2577 manager.go:319] Starting recovery of all containers Apr 24 14:24:37.199287 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.199241 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 14:24:37.202384 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.202358 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:37.206010 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.205989 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-82.ec2.internal\" not found" node="ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.209555 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.209538 2577 manager.go:324] Recovery completed Apr 24 14:24:37.211657 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.211628 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 14:24:37.214509 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.214495 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:37.216851 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.216837 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" 
event="NodeHasSufficientMemory" Apr 24 14:24:37.216920 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.216880 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:37.216920 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.216895 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:37.217364 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.217350 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 14:24:37.217364 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.217362 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 14:24:37.217463 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.217380 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:24:37.219835 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.219823 2577 policy_none.go:49] "None policy: Start" Apr 24 14:24:37.219917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.219838 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 14:24:37.219917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.219848 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 24 14:24:37.256474 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.256456 2577 manager.go:341] "Starting Device Plugin manager" Apr 24 14:24:37.260215 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.256528 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 14:24:37.260215 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.256538 2577 server.go:85] "Starting device plugin registration server" Apr 24 14:24:37.260215 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.256793 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 14:24:37.260215 ip-10-0-134-82 
kubenswrapper[2577]: I0424 14:24:37.256809 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 14:24:37.260215 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.256908 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 14:24:37.260215 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.256999 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 14:24:37.260215 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.257012 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 14:24:37.260215 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.257749 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 14:24:37.260215 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.257783 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-82.ec2.internal\" not found" Apr 24 14:24:37.324164 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.324117 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 14:24:37.325275 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.325256 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 14:24:37.325409 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.325284 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 14:24:37.325409 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.325304 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 14:24:37.325409 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.325311 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 14:24:37.325409 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.325401 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 14:24:37.327138 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.327118 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:37.357188 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.357124 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:37.358309 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.358292 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:37.358398 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.358322 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:37.358398 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.358336 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:37.358398 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.358360 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.367975 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.367955 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.368042 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.367982 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-82.ec2.internal\": node \"ip-10-0-134-82.ec2.internal\" not found" Apr 24 14:24:37.381683 
ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.381664 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found" Apr 24 14:24:37.426462 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.426428 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal"] Apr 24 14:24:37.426559 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.426520 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:37.428551 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.428536 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:37.428630 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.428566 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:37.428630 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.428581 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:37.429986 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.429972 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:37.430139 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.430119 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.430182 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.430160 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:37.430720 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.430699 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:37.430831 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.430729 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:37.430831 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.430739 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:37.430831 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.430699 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:37.430831 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.430817 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:37.431054 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.430833 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:37.432007 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.431991 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.432060 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.432025 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:37.432739 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.432725 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:37.432823 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.432754 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:37.432823 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.432768 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:37.456645 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.456621 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-82.ec2.internal\" not found" node="ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.461173 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.461157 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-82.ec2.internal\" not found" node="ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.481911 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.481883 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found" Apr 24 14:24:37.498752 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.498726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/52bd46895be952cff06d137a21c827f1-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal\" (UID: \"52bd46895be952cff06d137a21c827f1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.498840 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.498756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52bd46895be952cff06d137a21c827f1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal\" (UID: \"52bd46895be952cff06d137a21c827f1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.498840 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.498780 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/20602652b465b1663ffa2d71aa406f85-config\") pod \"kube-apiserver-proxy-ip-10-0-134-82.ec2.internal\" (UID: \"20602652b465b1663ffa2d71aa406f85\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.582959 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.582920 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found" Apr 24 14:24:37.599282 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.599260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52bd46895be952cff06d137a21c827f1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal\" (UID: \"52bd46895be952cff06d137a21c827f1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.599333 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.599292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/20602652b465b1663ffa2d71aa406f85-config\") pod \"kube-apiserver-proxy-ip-10-0-134-82.ec2.internal\" (UID: \"20602652b465b1663ffa2d71aa406f85\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.599333 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.599317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/52bd46895be952cff06d137a21c827f1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal\" (UID: \"52bd46895be952cff06d137a21c827f1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.599399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.599344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/52bd46895be952cff06d137a21c827f1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal\" (UID: \"52bd46895be952cff06d137a21c827f1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.599399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.599368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/20602652b465b1663ffa2d71aa406f85-config\") pod \"kube-apiserver-proxy-ip-10-0-134-82.ec2.internal\" (UID: \"20602652b465b1663ffa2d71aa406f85\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal" Apr 24 14:24:37.599399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.599344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52bd46895be952cff06d137a21c827f1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal\" (UID: \"52bd46895be952cff06d137a21c827f1\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal"
Apr 24 14:24:37.683651 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.683574 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found"
Apr 24 14:24:37.759050 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.759023 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal"
Apr 24 14:24:37.763757 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:37.763739 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal"
Apr 24 14:24:37.784300 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.784269 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found"
Apr 24 14:24:37.884802 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.884748 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found"
Apr 24 14:24:37.985363 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:37.985279 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found"
Apr 24 14:24:38.085918 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:38.085884 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found"
Apr 24 14:24:38.097047 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.097022 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 14:24:38.097183 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.097167 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:38.097219 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.097193 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:38.186841 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:38.186807 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found"
Apr 24 14:24:38.192949 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.192918 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 14:19:37 +0000 UTC" deadline="2027-10-01 02:33:26.481398545 +0000 UTC"
Apr 24 14:24:38.192949 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.192947 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12588h8m48.288455322s"
Apr 24 14:24:38.196372 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.196351 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 14:24:38.216911 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.216880 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:24:38.229721 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:38.229694 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52bd46895be952cff06d137a21c827f1.slice/crio-bf2949ff77dda4c4d024cd8352b3b023191badf417d2b8b094f4c604140cf489 WatchSource:0}: Error finding container bf2949ff77dda4c4d024cd8352b3b023191badf417d2b8b094f4c604140cf489: Status 404 returned error can't find the container with id bf2949ff77dda4c4d024cd8352b3b023191badf417d2b8b094f4c604140cf489
Apr 24 14:24:38.230583 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:38.230554 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20602652b465b1663ffa2d71aa406f85.slice/crio-abaf054d36b8911806d41f8330fb0bd404d216d1f913ed647869e57e88bef794 WatchSource:0}: Error finding container abaf054d36b8911806d41f8330fb0bd404d216d1f913ed647869e57e88bef794: Status 404 returned error can't find the container with id abaf054d36b8911806d41f8330fb0bd404d216d1f913ed647869e57e88bef794
Apr 24 14:24:38.234801 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.234785 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:24:38.242827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.242805 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-l6sk7"
Apr 24 14:24:38.250768 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.250750 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-l6sk7"
Apr 24 14:24:38.287328 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:38.287287 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found"
Apr 24 14:24:38.328846 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.328796 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" event={"ID":"52bd46895be952cff06d137a21c827f1","Type":"ContainerStarted","Data":"bf2949ff77dda4c4d024cd8352b3b023191badf417d2b8b094f4c604140cf489"}
Apr 24 14:24:38.329705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.329684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal" event={"ID":"20602652b465b1663ffa2d71aa406f85","Type":"ContainerStarted","Data":"abaf054d36b8911806d41f8330fb0bd404d216d1f913ed647869e57e88bef794"}
Apr 24 14:24:38.387922 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:38.387849 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found"
Apr 24 14:24:38.488495 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:38.488405 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-82.ec2.internal\" not found"
Apr 24 14:24:38.507415 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.507376 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:38.597149 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.597116 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal"
Apr 24 14:24:38.606295 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.606268 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:24:38.608097 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.608074 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal"
Apr 24 14:24:38.609402 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.609382 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:38.618447 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.618332 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:24:38.900961 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:38.900886 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:39.174464 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.174141 2577 apiserver.go:52] "Watching apiserver"
Apr 24 14:24:39.183113 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.183080 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 14:24:39.183499 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.183474 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7nph4","openshift-multus/multus-additional-cni-plugins-kplkp","openshift-multus/network-metrics-daemon-49nxb","openshift-network-diagnostics/network-check-target-rn5ql","openshift-network-operator/iptables-alerter-4fxh7","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd","openshift-cluster-node-tuning-operator/tuned-h7rhc","openshift-dns/node-resolver-w4bb5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal","openshift-multus/multus-p6pq2","openshift-ovn-kubernetes/ovnkube-node-tdc2t","kube-system/konnectivity-agent-wt2bk","kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal"]
Apr 24 14:24:39.185739 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.185702 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.187016 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.186990 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.188072 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.188052 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:39.188180 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.188128 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:39.188843 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.188820 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6lvp6\""
Apr 24 14:24:39.189010 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.188993 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 14:24:39.189060 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.188995 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 14:24:39.189363 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.189345 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:39.189451 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.189361 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 14:24:39.189451 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.189403 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:39.189547 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.189501 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 14:24:39.189547 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.189523 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 14:24:39.189816 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.189798 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9fsgn\""
Apr 24 14:24:39.189925 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.189821 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 14:24:39.189925 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.189823 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 14:24:39.190846 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.190829 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4fxh7"
Apr 24 14:24:39.192217 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.192198 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.193283 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.193255 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 14:24:39.193374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.193294 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 14:24:39.193374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.193326 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 14:24:39.193484 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.193385 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dtb4c\""
Apr 24 14:24:39.193685 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.193667 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7nph4"
Apr 24 14:24:39.194601 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.194580 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 14:24:39.194673 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.194610 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rxp5q\""
Apr 24 14:24:39.194673 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.194627 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 14:24:39.194673 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.194636 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 14:24:39.194933 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.194909 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w4bb5"
Apr 24 14:24:39.196110 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.196090 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.196201 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.196169 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 14:24:39.196201 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.196180 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 14:24:39.196498 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.196478 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6brmq\""
Apr 24 14:24:39.196579 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.196479 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 14:24:39.197270 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.197181 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 14:24:39.197368 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.197276 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-x9zq7\""
Apr 24 14:24:39.197652 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.197571 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 14:24:39.197652 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.197642 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.199235 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.199055 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 14:24:39.199235 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.199159 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-hzwhh\""
Apr 24 14:24:39.200936 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.199599 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wt2bk"
Apr 24 14:24:39.200936 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.200831 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 14:24:39.201085 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.200987 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 14:24:39.201662 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.201278 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l8zq2\""
Apr 24 14:24:39.201662 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.201469 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 14:24:39.201662 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.201469 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 14:24:39.201662 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.201512 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 14:24:39.202321 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.202301 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 14:24:39.203098 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.203070 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 14:24:39.203400 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.203227 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 14:24:39.203539 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.203515 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-pkpvs\""
Apr 24 14:24:39.207046 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8ace694b-9bcf-445d-8e37-b1371853f469-etc-tuned\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.207159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-systemd-units\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.207159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-system-cni-dir\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.207159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-cni-dir\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.207159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-os-release\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.207355 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/48c500e5-ff8b-4e0c-bdda-745035b2e024-env-overrides\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.207355 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/48c500e5-ff8b-4e0c-bdda-745035b2e024-ovnkube-script-lib\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.207355 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8-tmp-dir\") pod \"node-resolver-w4bb5\" (UID: \"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8\") " pod="openshift-dns/node-resolver-w4bb5"
Apr 24 14:24:39.207355 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.207355 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.207580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-run-netns\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.207580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207381 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-run-systemd\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.207580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207420 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrl5x\" (UniqueName: \"kubernetes.io/projected/9bac24c0-4d8b-4f25-88b3-6d4cebc649bf-kube-api-access-wrl5x\") pod \"node-ca-7nph4\" (UID: \"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf\") " pod="openshift-image-registry/node-ca-7nph4"
Apr 24 14:24:39.207580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207443 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9dfx\" (UniqueName: \"kubernetes.io/projected/5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8-kube-api-access-z9dfx\") pod \"node-resolver-w4bb5\" (UID: \"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8\") " pod="openshift-dns/node-resolver-w4bb5"
Apr 24 14:24:39.207580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207464 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-cni-binary-copy\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.207580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-modprobe-d\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.207580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-sysconfig\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.207580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207579 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-kubernetes\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-run\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-conf-dir\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-daemon-config\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8-hosts-file\") pod \"node-resolver-w4bb5\" (UID: \"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8\") " pod="openshift-dns/node-resolver-w4bb5"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207771 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-sysctl-conf\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207792 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-log-socket\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207809 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-run-ovn-kubernetes\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c100dd1e-57a3-471e-998a-d002af692c13-iptables-alerter-script\") pod \"iptables-alerter-4fxh7\" (UID: \"c100dd1e-57a3-471e-998a-d002af692c13\") " pod="openshift-network-operator/iptables-alerter-4fxh7"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-var-lib-kubelet\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-socket-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-node-log\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.207955 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-cni-bin\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.207998 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-var-lib-kubelet\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbg2h\" (UniqueName: \"kubernetes.io/projected/66176040-5bda-46d8-aaba-ef37c25ad37e-kube-api-access-sbg2h\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-device-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c100dd1e-57a3-471e-998a-d002af692c13-host-slash\") pod \"iptables-alerter-4fxh7\" (UID: \"c100dd1e-57a3-471e-998a-d002af692c13\") " pod="openshift-network-operator/iptables-alerter-4fxh7"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208189 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-etc-kubernetes\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh47j\" (UniqueName: \"kubernetes.io/projected/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-kube-api-access-hh47j\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-sysctl-d\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208260 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-slash\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208282 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-cnibin\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-run-k8s-cni-cncf-io\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-systemd\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208353 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-cnibin\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.208374 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-etc-selinux\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-sys\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208421 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/48c500e5-ff8b-4e0c-bdda-745035b2e024-ovnkube-config\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208443 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-var-lib-openvswitch\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-os-release\") pod \"multus-p6pq2\" (UID:
\"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208491 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-run-netns\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/66176040-5bda-46d8-aaba-ef37c25ad37e-cni-binary-copy\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-run-ovn\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-var-lib-cni-bin\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-host\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77dzp\" (UniqueName: \"kubernetes.io/projected/e33e5322-8c6e-4112-b027-ca7e081d534b-kube-api-access-77dzp\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/48c500e5-ff8b-4e0c-bdda-745035b2e024-ovn-node-metrics-cert\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxkvj\" (UniqueName: \"kubernetes.io/projected/48c500e5-ff8b-4e0c-bdda-745035b2e024-kube-api-access-hxkvj\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhkwf\" (UniqueName: \"kubernetes.io/projected/c100dd1e-57a3-471e-998a-d002af692c13-kube-api-access-xhkwf\") pod \"iptables-alerter-4fxh7\" (UID: \"c100dd1e-57a3-471e-998a-d002af692c13\") " pod="openshift-network-operator/iptables-alerter-4fxh7" Apr 24 14:24:39.208972 
ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bac24c0-4d8b-4f25-88b3-6d4cebc649bf-host\") pod \"node-ca-7nph4\" (UID: \"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf\") " pod="openshift-image-registry/node-ca-7nph4" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208797 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9bac24c0-4d8b-4f25-88b3-6d4cebc649bf-serviceca\") pod \"node-ca-7nph4\" (UID: \"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf\") " pod="openshift-image-registry/node-ca-7nph4" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-run-multus-certs\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.208972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-lib-modules\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-kubelet\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-socket-dir-parent\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-var-lib-cni-multus\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.208977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ace694b-9bcf-445d-8e37-b1371853f469-tmp\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-259md\" (UniqueName: \"kubernetes.io/projected/8ace694b-9bcf-445d-8e37-b1371853f469-kube-api-access-259md\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-system-cni-dir\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209049 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/66176040-5bda-46d8-aaba-ef37c25ad37e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2677\" (UniqueName: \"kubernetes.io/projected/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-kube-api-access-s2677\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/66176040-5bda-46d8-aaba-ef37c25ad37e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-registration-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: 
\"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209205 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-sys-fs\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-etc-openvswitch\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-run-openvswitch\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209282 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-cni-netd\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.209705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.209305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-hostroot\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.233092 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.233072 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:39.251927 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.251894 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:38 +0000 UTC" deadline="2027-11-10 00:42:58.164587508 +0000 UTC" Apr 24 14:24:39.252018 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.251928 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13546h18m18.912662702s" Apr 24 14:24:39.298464 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.298437 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 14:24:39.309736 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.309697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:24:39.309736 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.309730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8-hosts-file\") pod \"node-resolver-w4bb5\" (UID: \"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8\") " pod="openshift-dns/node-resolver-w4bb5" Apr 24 14:24:39.309970 ip-10-0-134-82 kubenswrapper[2577]: I0424 
14:24:39.309756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-sysctl-conf\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.309970 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.309883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-log-socket\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.309970 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.309920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-run-ovn-kubernetes\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.309970 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.309929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-log-socket\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.309970 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.309933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-sysctl-conf\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.309970 ip-10-0-134-82 kubenswrapper[2577]: I0424 
14:24:39.309856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8-hosts-file\") pod \"node-resolver-w4bb5\" (UID: \"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8\") " pod="openshift-dns/node-resolver-w4bb5" Apr 24 14:24:39.309970 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.309954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c100dd1e-57a3-471e-998a-d002af692c13-iptables-alerter-script\") pod \"iptables-alerter-4fxh7\" (UID: \"c100dd1e-57a3-471e-998a-d002af692c13\") " pod="openshift-network-operator/iptables-alerter-4fxh7" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-run-ovn-kubernetes\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-var-lib-kubelet\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-socket-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" Apr 24 14:24:39.310310 
ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-node-log\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-cni-bin\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-var-lib-kubelet\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310192 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-var-lib-kubelet\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.310310 
ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310229 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-node-log\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-cni-bin\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-socket-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" Apr 24 14:24:39.310310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbg2h\" (UniqueName: \"kubernetes.io/projected/66176040-5bda-46d8-aaba-ef37c25ad37e-kube-api-access-sbg2h\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " 
pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.310332 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-device-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c100dd1e-57a3-471e-998a-d002af692c13-host-slash\") pod \"iptables-alerter-4fxh7\" (UID: \"c100dd1e-57a3-471e-998a-d002af692c13\") " pod="openshift-network-operator/iptables-alerter-4fxh7" Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.310455 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs podName:216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:39.810420314 +0000 UTC m=+3.068322341 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs") pod "network-metrics-daemon-49nxb" (UID: "216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c100dd1e-57a3-471e-998a-d002af692c13-host-slash\") pod \"iptables-alerter-4fxh7\" (UID: \"c100dd1e-57a3-471e-998a-d002af692c13\") " pod="openshift-network-operator/iptables-alerter-4fxh7" Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310426 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-var-lib-kubelet\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c100dd1e-57a3-471e-998a-d002af692c13-iptables-alerter-script\") pod \"iptables-alerter-4fxh7\" (UID: \"c100dd1e-57a3-471e-998a-d002af692c13\") " pod="openshift-network-operator/iptables-alerter-4fxh7" Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-etc-kubernetes\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310655 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-device-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-etc-kubernetes\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh47j\" (UniqueName: \"kubernetes.io/projected/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-kube-api-access-hh47j\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-sysctl-d\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-slash\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.310954 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.310861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-sysctl-d\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-slash\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-cnibin\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-run-k8s-cni-cncf-io\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-systemd\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-cnibin\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-etc-selinux\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-sys\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-run-k8s-cni-cncf-io\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311323 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/48c500e5-ff8b-4e0c-bdda-745035b2e024-ovnkube-config\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-systemd\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-var-lib-openvswitch\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-os-release\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-cnibin\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-cnibin\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-etc-selinux\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-run-netns\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311459 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-sys\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.311580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-run-netns\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311486 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-os-release\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311496 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-var-lib-openvswitch\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/66176040-5bda-46d8-aaba-ef37c25ad37e-cni-binary-copy\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-run-ovn\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-var-lib-cni-bin\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-run-ovn\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-host\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311640 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77dzp\" (UniqueName: \"kubernetes.io/projected/e33e5322-8c6e-4112-b027-ca7e081d534b-kube-api-access-77dzp\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-var-lib-cni-bin\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/48c500e5-ff8b-4e0c-bdda-745035b2e024-ovn-node-metrics-cert\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxkvj\" (UniqueName: \"kubernetes.io/projected/48c500e5-ff8b-4e0c-bdda-745035b2e024-kube-api-access-hxkvj\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311708 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-host\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhkwf\" (UniqueName: \"kubernetes.io/projected/c100dd1e-57a3-471e-998a-d002af692c13-kube-api-access-xhkwf\") pod \"iptables-alerter-4fxh7\" (UID: \"c100dd1e-57a3-471e-998a-d002af692c13\") " pod="openshift-network-operator/iptables-alerter-4fxh7"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311779 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bac24c0-4d8b-4f25-88b3-6d4cebc649bf-host\") pod \"node-ca-7nph4\" (UID: \"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf\") " pod="openshift-image-registry/node-ca-7nph4"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9bac24c0-4d8b-4f25-88b3-6d4cebc649bf-serviceca\") pod \"node-ca-7nph4\" (UID: \"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf\") " pod="openshift-image-registry/node-ca-7nph4"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-run-multus-certs\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-lib-modules\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.312399 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-kubelet\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-socket-dir-parent\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.311978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-var-lib-cni-multus\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312034 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/48c500e5-ff8b-4e0c-bdda-745035b2e024-ovnkube-config\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ace694b-9bcf-445d-8e37-b1371853f469-tmp\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312076 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/66176040-5bda-46d8-aaba-ef37c25ad37e-cni-binary-copy\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-259md\" (UniqueName: \"kubernetes.io/projected/8ace694b-9bcf-445d-8e37-b1371853f469-kube-api-access-259md\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-system-cni-dir\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/66176040-5bda-46d8-aaba-ef37c25ad37e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-lib-modules\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-socket-dir-parent\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312193 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2677\" (UniqueName: \"kubernetes.io/projected/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-kube-api-access-s2677\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312223 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bac24c0-4d8b-4f25-88b3-6d4cebc649bf-host\") pod \"node-ca-7nph4\" (UID: \"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf\") " pod="openshift-image-registry/node-ca-7nph4"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312229 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-kubelet\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-system-cni-dir\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312380 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-run-multus-certs\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.313177 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/66176040-5bda-46d8-aaba-ef37c25ad37e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-registration-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312464 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-host-var-lib-cni-multus\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-sys-fs\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312533 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-registration-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-etc-openvswitch\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-run-openvswitch\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-cni-netd\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-hostroot\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8ace694b-9bcf-445d-8e37-b1371853f469-etc-tuned\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9bac24c0-4d8b-4f25-88b3-6d4cebc649bf-serviceca\") pod \"node-ca-7nph4\" (UID: \"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf\") " pod="openshift-image-registry/node-ca-7nph4"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-systemd-units\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/66176040-5bda-46d8-aaba-ef37c25ad37e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-hostroot\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-run-openvswitch\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-sys-fs\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-etc-openvswitch\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.313890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312738 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-cni-netd\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-systemd-units\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-system-cni-dir\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-cni-dir\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.312827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-os-release\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.313806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/48c500e5-ff8b-4e0c-bdda-745035b2e024-env-overrides\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.313979 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/66176040-5bda-46d8-aaba-ef37c25ad37e-os-release\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-system-cni-dir\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/48c500e5-ff8b-4e0c-bdda-745035b2e024-env-overrides\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/48c500e5-ff8b-4e0c-bdda-745035b2e024-ovnkube-script-lib\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314130 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8-tmp-dir\") pod \"node-resolver-w4bb5\" (UID: \"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8\") " pod="openshift-dns/node-resolver-w4bb5"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314175 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9bb73f96-473b-43fe-89d5-d7ba4f64faf2-agent-certs\") pod \"konnectivity-agent-wt2bk\" (UID: \"9bb73f96-473b-43fe-89d5-d7ba4f64faf2\") " pod="kube-system/konnectivity-agent-wt2bk"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9bb73f96-473b-43fe-89d5-d7ba4f64faf2-konnectivity-ca\") pod \"konnectivity-agent-wt2bk\" (UID: \"9bb73f96-473b-43fe-89d5-d7ba4f64faf2\") " pod="kube-system/konnectivity-agent-wt2bk"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-run-netns\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314390 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-run-systemd\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t"
Apr 24 14:24:39.314622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrl5x\" (UniqueName: \"kubernetes.io/projected/9bac24c0-4d8b-4f25-88b3-6d4cebc649bf-kube-api-access-wrl5x\") pod \"node-ca-7nph4\" (UID: \"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf\") " pod="openshift-image-registry/node-ca-7nph4"
Apr 24 14:24:39.315346 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9dfx\" (UniqueName: \"kubernetes.io/projected/5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8-kube-api-access-z9dfx\") pod \"node-resolver-w4bb5\" (UID: \"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8\") " pod="openshift-dns/node-resolver-w4bb5"
Apr 24 14:24:39.315346 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-cni-binary-copy\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.315346 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-modprobe-d\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.315346 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-sysconfig\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.315346 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-kubernetes\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.315346 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-run\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.315346 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-conf-dir\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.315346 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.314773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-daemon-config\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.315679 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.315394 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-daemon-config\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.315679 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.315650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-cni-dir\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2"
Apr 24 14:24:39.316124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.315814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8ace694b-9bcf-445d-8e37-b1371853f469-etc-tuned\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.316124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.315942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ace694b-9bcf-445d-8e37-b1371853f469-tmp\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc"
Apr 24 14:24:39.316124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.315994 2577 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/48c500e5-ff8b-4e0c-bdda-745035b2e024-ovnkube-script-lib\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.316124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316016 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/66176040-5bda-46d8-aaba-ef37c25ad37e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.316124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.316124 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316036 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/48c500e5-ff8b-4e0c-bdda-745035b2e024-ovn-node-metrics-cert\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.316687 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316182 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-modprobe-d\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 
14:24:39.316687 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-kubernetes\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.316687 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-run-systemd\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.316687 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-etc-sysconfig\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.316687 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/48c500e5-ff8b-4e0c-bdda-745035b2e024-host-run-netns\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.316687 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8ace694b-9bcf-445d-8e37-b1371853f469-run\") pod \"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.316687 ip-10-0-134-82 kubenswrapper[2577]: I0424 
14:24:39.316362 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e33e5322-8c6e-4112-b027-ca7e081d534b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" Apr 24 14:24:39.316687 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-multus-conf-dir\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.318204 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316843 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-cni-binary-copy\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.318204 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.316994 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8-tmp-dir\") pod \"node-resolver-w4bb5\" (UID: \"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8\") " pod="openshift-dns/node-resolver-w4bb5" Apr 24 14:24:39.321797 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.321774 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:39.321797 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.321801 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 
14:24:39.321993 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.321816 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dg29k for pod openshift-network-diagnostics/network-check-target-rn5ql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:39.321993 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.321890 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k podName:442ed584-8835-435d-8b83-97804ed0f554 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:39.821861201 +0000 UTC m=+3.079763206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dg29k" (UniqueName: "kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k") pod "network-check-target-rn5ql" (UID: "442ed584-8835-435d-8b83-97804ed0f554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:39.331972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.331938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhkwf\" (UniqueName: \"kubernetes.io/projected/c100dd1e-57a3-471e-998a-d002af692c13-kube-api-access-xhkwf\") pod \"iptables-alerter-4fxh7\" (UID: \"c100dd1e-57a3-471e-998a-d002af692c13\") " pod="openshift-network-operator/iptables-alerter-4fxh7" Apr 24 14:24:39.332504 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.332369 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrl5x\" (UniqueName: \"kubernetes.io/projected/9bac24c0-4d8b-4f25-88b3-6d4cebc649bf-kube-api-access-wrl5x\") pod \"node-ca-7nph4\" (UID: \"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf\") " 
pod="openshift-image-registry/node-ca-7nph4" Apr 24 14:24:39.332771 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.332750 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9dfx\" (UniqueName: \"kubernetes.io/projected/5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8-kube-api-access-z9dfx\") pod \"node-resolver-w4bb5\" (UID: \"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8\") " pod="openshift-dns/node-resolver-w4bb5" Apr 24 14:24:39.334628 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.334601 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxkvj\" (UniqueName: \"kubernetes.io/projected/48c500e5-ff8b-4e0c-bdda-745035b2e024-kube-api-access-hxkvj\") pod \"ovnkube-node-tdc2t\" (UID: \"48c500e5-ff8b-4e0c-bdda-745035b2e024\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.335283 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.335253 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77dzp\" (UniqueName: \"kubernetes.io/projected/e33e5322-8c6e-4112-b027-ca7e081d534b-kube-api-access-77dzp\") pod \"aws-ebs-csi-driver-node-lv7hd\" (UID: \"e33e5322-8c6e-4112-b027-ca7e081d534b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" Apr 24 14:24:39.335497 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.335469 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2677\" (UniqueName: \"kubernetes.io/projected/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-kube-api-access-s2677\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:24:39.335620 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.335599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-259md\" (UniqueName: \"kubernetes.io/projected/8ace694b-9bcf-445d-8e37-b1371853f469-kube-api-access-259md\") pod 
\"tuned-h7rhc\" (UID: \"8ace694b-9bcf-445d-8e37-b1371853f469\") " pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.335979 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.335932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh47j\" (UniqueName: \"kubernetes.io/projected/0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67-kube-api-access-hh47j\") pod \"multus-p6pq2\" (UID: \"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67\") " pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.336311 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.336288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbg2h\" (UniqueName: \"kubernetes.io/projected/66176040-5bda-46d8-aaba-ef37c25ad37e-kube-api-access-sbg2h\") pod \"multus-additional-cni-plugins-kplkp\" (UID: \"66176040-5bda-46d8-aaba-ef37c25ad37e\") " pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.415160 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.415126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9bb73f96-473b-43fe-89d5-d7ba4f64faf2-agent-certs\") pod \"konnectivity-agent-wt2bk\" (UID: \"9bb73f96-473b-43fe-89d5-d7ba4f64faf2\") " pod="kube-system/konnectivity-agent-wt2bk" Apr 24 14:24:39.415160 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.415166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9bb73f96-473b-43fe-89d5-d7ba4f64faf2-konnectivity-ca\") pod \"konnectivity-agent-wt2bk\" (UID: \"9bb73f96-473b-43fe-89d5-d7ba4f64faf2\") " pod="kube-system/konnectivity-agent-wt2bk" Apr 24 14:24:39.415878 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.415835 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/9bb73f96-473b-43fe-89d5-d7ba4f64faf2-konnectivity-ca\") pod \"konnectivity-agent-wt2bk\" (UID: \"9bb73f96-473b-43fe-89d5-d7ba4f64faf2\") " pod="kube-system/konnectivity-agent-wt2bk" Apr 24 14:24:39.417591 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.417568 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9bb73f96-473b-43fe-89d5-d7ba4f64faf2-agent-certs\") pod \"konnectivity-agent-wt2bk\" (UID: \"9bb73f96-473b-43fe-89d5-d7ba4f64faf2\") " pod="kube-system/konnectivity-agent-wt2bk" Apr 24 14:24:39.498296 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.498182 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" Apr 24 14:24:39.507980 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.507959 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kplkp" Apr 24 14:24:39.516622 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.516598 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4fxh7" Apr 24 14:24:39.523194 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.523173 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" Apr 24 14:24:39.528748 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.528727 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7nph4" Apr 24 14:24:39.535258 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.535239 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w4bb5" Apr 24 14:24:39.540849 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.540828 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-p6pq2" Apr 24 14:24:39.546486 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.546467 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:24:39.552047 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.552028 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wt2bk" Apr 24 14:24:39.798000 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:39.797967 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c3fa93e_ebcf_4859_a30a_a5dfd8bd28e8.slice/crio-7edc9f250baf6b1682f21c5fb81fa40428047ae14544e59b40b9944fc08d3d40 WatchSource:0}: Error finding container 7edc9f250baf6b1682f21c5fb81fa40428047ae14544e59b40b9944fc08d3d40: Status 404 returned error can't find the container with id 7edc9f250baf6b1682f21c5fb81fa40428047ae14544e59b40b9944fc08d3d40 Apr 24 14:24:39.799723 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:39.799702 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48c500e5_ff8b_4e0c_bdda_745035b2e024.slice/crio-bf4977c2d131e4525f03daa71a909016166aa4bfce22a129a4d404d42ce322f3 WatchSource:0}: Error finding container bf4977c2d131e4525f03daa71a909016166aa4bfce22a129a4d404d42ce322f3: Status 404 returned error can't find the container with id bf4977c2d131e4525f03daa71a909016166aa4bfce22a129a4d404d42ce322f3 Apr 24 14:24:39.804752 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:39.804623 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc100dd1e_57a3_471e_998a_d002af692c13.slice/crio-4a4496afc7a9bb46cf745d67f7246b52a1680887a335f3a0c15c1455fa6cb160 WatchSource:0}: Error finding container 
4a4496afc7a9bb46cf745d67f7246b52a1680887a335f3a0c15c1455fa6cb160: Status 404 returned error can't find the container with id 4a4496afc7a9bb46cf745d67f7246b52a1680887a335f3a0c15c1455fa6cb160 Apr 24 14:24:39.805571 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:39.805552 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode33e5322_8c6e_4112_b027_ca7e081d534b.slice/crio-45fc001a1b6319239817657ed4f3483d8f272666ad61089f3be548d231915a5e WatchSource:0}: Error finding container 45fc001a1b6319239817657ed4f3483d8f272666ad61089f3be548d231915a5e: Status 404 returned error can't find the container with id 45fc001a1b6319239817657ed4f3483d8f272666ad61089f3be548d231915a5e Apr 24 14:24:39.806338 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:39.806314 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66176040_5bda_46d8_aaba_ef37c25ad37e.slice/crio-a8706ca27a2e7a0d82ae5a9297ec30b4a414636014b9e7e9f2b9aab264651a1a WatchSource:0}: Error finding container a8706ca27a2e7a0d82ae5a9297ec30b4a414636014b9e7e9f2b9aab264651a1a: Status 404 returned error can't find the container with id a8706ca27a2e7a0d82ae5a9297ec30b4a414636014b9e7e9f2b9aab264651a1a Apr 24 14:24:39.807290 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:39.807268 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ce9a076_3b33_4c2e_b35e_6fb3cf4fce67.slice/crio-dc7c09acd752c410d548f07a712b4aed4a1b506cc2c67dbd60e1d8c25b0ba78c WatchSource:0}: Error finding container dc7c09acd752c410d548f07a712b4aed4a1b506cc2c67dbd60e1d8c25b0ba78c: Status 404 returned error can't find the container with id dc7c09acd752c410d548f07a712b4aed4a1b506cc2c67dbd60e1d8c25b0ba78c Apr 24 14:24:39.808301 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:39.808256 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ace694b_9bcf_445d_8e37_b1371853f469.slice/crio-3a54438cae0f5e3f5132ee2a2a6fab0a8008b47cd11fc5f3ac0d8833c1ec4e1b WatchSource:0}: Error finding container 3a54438cae0f5e3f5132ee2a2a6fab0a8008b47cd11fc5f3ac0d8833c1ec4e1b: Status 404 returned error can't find the container with id 3a54438cae0f5e3f5132ee2a2a6fab0a8008b47cd11fc5f3ac0d8833c1ec4e1b Apr 24 14:24:39.809149 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:39.809125 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb73f96_473b_43fe_89d5_d7ba4f64faf2.slice/crio-c4c0872fcbffe0a1c47c430a2c754b9d49808adbcab7c416af34d7529eb5a9ed WatchSource:0}: Error finding container c4c0872fcbffe0a1c47c430a2c754b9d49808adbcab7c416af34d7529eb5a9ed: Status 404 returned error can't find the container with id c4c0872fcbffe0a1c47c430a2c754b9d49808adbcab7c416af34d7529eb5a9ed Apr 24 14:24:39.811021 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:24:39.810999 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bac24c0_4d8b_4f25_88b3_6d4cebc649bf.slice/crio-f72de3f5764d667177bda91e6ce2fb616e73355cbc5a76bd515520b7173cb6e9 WatchSource:0}: Error finding container f72de3f5764d667177bda91e6ce2fb616e73355cbc5a76bd515520b7173cb6e9: Status 404 returned error can't find the container with id f72de3f5764d667177bda91e6ce2fb616e73355cbc5a76bd515520b7173cb6e9 Apr 24 14:24:39.816968 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.816944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:24:39.817093 ip-10-0-134-82 
kubenswrapper[2577]: E0424 14:24:39.817077 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:39.817166 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.817126 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs podName:216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:40.817111065 +0000 UTC m=+4.075013072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs") pod "network-metrics-daemon-49nxb" (UID: "216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:39.917716 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:39.917684 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:24:39.917890 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.917835 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:39.917890 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.917856 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:39.917890 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.917884 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dg29k for pod 
openshift-network-diagnostics/network-check-target-rn5ql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:39.918008 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:39.917941 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k podName:442ed584-8835-435d-8b83-97804ed0f554 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:40.917926958 +0000 UTC m=+4.175828961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg29k" (UniqueName: "kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k") pod "network-check-target-rn5ql" (UID: "442ed584-8835-435d-8b83-97804ed0f554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:40.252909 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.252774 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:38 +0000 UTC" deadline="2028-01-13 01:01:36.494385648 +0000 UTC" Apr 24 14:24:40.252909 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.252819 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15082h36m56.241570474s" Apr 24 14:24:40.347252 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.346411 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w4bb5" event={"ID":"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8","Type":"ContainerStarted","Data":"7edc9f250baf6b1682f21c5fb81fa40428047ae14544e59b40b9944fc08d3d40"} Apr 24 14:24:40.360890 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.360837 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal" event={"ID":"20602652b465b1663ffa2d71aa406f85","Type":"ContainerStarted","Data":"7ae123e05dd02bbdd1b150287960ad654965b8250e7b43d74e73b67e386c6e4e"} Apr 24 14:24:40.371481 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.371446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p6pq2" event={"ID":"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67","Type":"ContainerStarted","Data":"dc7c09acd752c410d548f07a712b4aed4a1b506cc2c67dbd60e1d8c25b0ba78c"} Apr 24 14:24:40.382246 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.382189 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4fxh7" event={"ID":"c100dd1e-57a3-471e-998a-d002af692c13","Type":"ContainerStarted","Data":"4a4496afc7a9bb46cf745d67f7246b52a1680887a335f3a0c15c1455fa6cb160"} Apr 24 14:24:40.386561 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.386520 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7nph4" event={"ID":"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf","Type":"ContainerStarted","Data":"f72de3f5764d667177bda91e6ce2fb616e73355cbc5a76bd515520b7173cb6e9"} Apr 24 14:24:40.393536 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.393484 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wt2bk" event={"ID":"9bb73f96-473b-43fe-89d5-d7ba4f64faf2","Type":"ContainerStarted","Data":"c4c0872fcbffe0a1c47c430a2c754b9d49808adbcab7c416af34d7529eb5a9ed"} Apr 24 14:24:40.403475 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.403440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" event={"ID":"8ace694b-9bcf-445d-8e37-b1371853f469","Type":"ContainerStarted","Data":"3a54438cae0f5e3f5132ee2a2a6fab0a8008b47cd11fc5f3ac0d8833c1ec4e1b"} Apr 24 14:24:40.412589 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.412522 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kplkp" event={"ID":"66176040-5bda-46d8-aaba-ef37c25ad37e","Type":"ContainerStarted","Data":"a8706ca27a2e7a0d82ae5a9297ec30b4a414636014b9e7e9f2b9aab264651a1a"}
Apr 24 14:24:40.429081 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.429018 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" event={"ID":"e33e5322-8c6e-4112-b027-ca7e081d534b","Type":"ContainerStarted","Data":"45fc001a1b6319239817657ed4f3483d8f272666ad61089f3be548d231915a5e"}
Apr 24 14:24:40.439044 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.439006 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerStarted","Data":"bf4977c2d131e4525f03daa71a909016166aa4bfce22a129a4d404d42ce322f3"}
Apr 24 14:24:40.825566 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.825525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:40.825739 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:40.825717 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:40.825853 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:40.825839 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs podName:216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:42.825771935 +0000 UTC m=+6.083673956 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs") pod "network-metrics-daemon-49nxb" (UID: "216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:40.926664 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:40.926621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:40.926860 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:40.926842 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:40.926955 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:40.926881 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:40.926955 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:40.926896 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dg29k for pod openshift-network-diagnostics/network-check-target-rn5ql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:40.927061 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:40.926956 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k podName:442ed584-8835-435d-8b83-97804ed0f554 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:42.926936586 +0000 UTC m=+6.184838592 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg29k" (UniqueName: "kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k") pod "network-check-target-rn5ql" (UID: "442ed584-8835-435d-8b83-97804ed0f554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:41.328312 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:41.328221 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:41.328745 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:41.328354 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:41.328745 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:41.328469 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:41.328745 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:41.328555 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:41.463862 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:41.463824 2577 generic.go:358] "Generic (PLEG): container finished" podID="52bd46895be952cff06d137a21c827f1" containerID="a699285c135fad3943b754c2a347e9dba62de7a29556629f3b9140431c9641ff" exitCode=0
Apr 24 14:24:41.464810 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:41.464733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" event={"ID":"52bd46895be952cff06d137a21c827f1","Type":"ContainerDied","Data":"a699285c135fad3943b754c2a347e9dba62de7a29556629f3b9140431c9641ff"}
Apr 24 14:24:41.487512 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:41.487115 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-82.ec2.internal" podStartSLOduration=3.487096695 podStartE2EDuration="3.487096695s" podCreationTimestamp="2026-04-24 14:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:40.375694711 +0000 UTC m=+3.633596727" watchObservedRunningTime="2026-04-24 14:24:41.487096695 +0000 UTC m=+4.744998721"
Apr 24 14:24:42.471939 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:42.471333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" event={"ID":"52bd46895be952cff06d137a21c827f1","Type":"ContainerStarted","Data":"145b0340c6b175515abf0277cd3f095f41d5362ae8a42e07298a3075c55f4f83"}
Apr 24 14:24:42.843485 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:42.843393 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:42.843646 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:42.843552 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:42.843716 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:42.843648 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs podName:216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:46.84362761 +0000 UTC m=+10.101529624 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs") pod "network-metrics-daemon-49nxb" (UID: "216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:42.945409 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:42.944777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:42.945409 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:42.944974 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:42.945409 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:42.944994 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:42.945409 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:42.945008 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dg29k for pod openshift-network-diagnostics/network-check-target-rn5ql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:42.945409 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:42.945068 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k podName:442ed584-8835-435d-8b83-97804ed0f554 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:46.945049215 +0000 UTC m=+10.202951235 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg29k" (UniqueName: "kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k") pod "network-check-target-rn5ql" (UID: "442ed584-8835-435d-8b83-97804ed0f554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:43.325992 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:43.325934 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:43.326195 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:43.326078 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:43.326195 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:43.325940 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:43.326313 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:43.326197 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:45.325823 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:45.325760 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:45.325823 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:45.325809 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:45.326361 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:45.325904 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:45.326361 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:45.326002 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:46.877743 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:46.877659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:46.878233 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:46.877825 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:46.878233 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:46.877925 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs podName:216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:54.877902674 +0000 UTC m=+18.135804689 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs") pod "network-metrics-daemon-49nxb" (UID: "216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:46.979140 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:46.979033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:46.979140 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:46.979060 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:46.979140 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:46.979078 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:46.979140 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:46.979090 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dg29k for pod openshift-network-diagnostics/network-check-target-rn5ql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:46.979140 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:46.979144 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k podName:442ed584-8835-435d-8b83-97804ed0f554 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:54.979126525 +0000 UTC m=+18.237028532 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg29k" (UniqueName: "kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k") pod "network-check-target-rn5ql" (UID: "442ed584-8835-435d-8b83-97804ed0f554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:47.327418 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:47.327292 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:47.327565 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:47.327431 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:47.327565 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:47.327481 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:47.327685 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:47.327585 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:49.326437 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:49.326402 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:49.326889 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:49.326519 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:49.326889 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:49.326581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:49.326889 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:49.326681 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:51.326032 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:51.325984 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:51.326032 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:51.326020 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:51.326560 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:51.326117 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:51.326560 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:51.326286 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:53.325735 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:53.325694 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:53.326188 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:53.325737 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:53.326188 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:53.325840 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:53.326188 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:53.325971 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:54.933205 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:54.933164 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:54.933662 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:54.933344 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:54.933662 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:54.933435 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs podName:216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:10.933412042 +0000 UTC m=+34.191314054 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs") pod "network-metrics-daemon-49nxb" (UID: "216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:55.034464 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:55.034423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:55.034642 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:55.034616 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:55.034698 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:55.034644 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:55.034698 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:55.034658 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dg29k for pod openshift-network-diagnostics/network-check-target-rn5ql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:55.034798 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:55.034723 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k podName:442ed584-8835-435d-8b83-97804ed0f554 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:11.034702929 +0000 UTC m=+34.292604936 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg29k" (UniqueName: "kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k") pod "network-check-target-rn5ql" (UID: "442ed584-8835-435d-8b83-97804ed0f554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:55.325592 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:55.325498 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:55.325592 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:55.325551 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:55.325822 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:55.325631 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:55.325822 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:55.325764 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:57.326739 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.326393 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql"
Apr 24 14:24:57.327503 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.326451 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:24:57.327503 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:57.326818 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554"
Apr 24 14:24:57.327503 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:57.326918 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:24:57.497111 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.497091 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log"
Apr 24 14:24:57.497362 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.497341 2577 generic.go:358] "Generic (PLEG): container finished" podID="48c500e5-ff8b-4e0c-bdda-745035b2e024" containerID="fb18f39e8f30ccf73e60cab2128be8338bfbbea920b45fdefac79fd631d10d5d" exitCode=1
Apr 24 14:24:57.497421 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.497410 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerStarted","Data":"2ee63e756d812b0dad2c1f94b58df91c1c4e3d398f494563475839e1e26ce81b"}
Apr 24 14:24:57.497469 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.497433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerStarted","Data":"9c1602c93c17aba66ac8a39abc136071463eee671791500e0a8716d48501aa1c"}
Apr 24 14:24:57.497469 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.497443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerStarted","Data":"9b5fd67b0900ca61e05681835acebab7a4cc23e8aa966149d5e989c5503dea43"}
Apr 24 14:24:57.497469 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.497453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerDied","Data":"fb18f39e8f30ccf73e60cab2128be8338bfbbea920b45fdefac79fd631d10d5d"}
Apr 24 14:24:57.497569 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.497468 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerStarted","Data":"e7f1703b1cac8219d8bc825763653285c1bae143bd4f85e7b2ef103266e92a45"}
Apr 24 14:24:57.498511 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.498478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w4bb5" event={"ID":"5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8","Type":"ContainerStarted","Data":"0eec0b1e1ff65d51d2fa50c9dbc6d8512d779e7401532bb5ab551bdee6efd6ad"}
Apr 24 14:24:57.499602 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.499574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p6pq2" event={"ID":"0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67","Type":"ContainerStarted","Data":"fdee260279b300183cb25cb570e37be7ea2443be776ffb87fc82e59a492dcf78"}
Apr 24 14:24:57.500713 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.500691 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7nph4" event={"ID":"9bac24c0-4d8b-4f25-88b3-6d4cebc649bf","Type":"ContainerStarted","Data":"d10bca546d6162ddec17d2ba5a82310170caa90b8132ccef4b1dda7c0c67eb2e"}
Apr 24 14:24:57.502001 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.501977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wt2bk" event={"ID":"9bb73f96-473b-43fe-89d5-d7ba4f64faf2","Type":"ContainerStarted","Data":"edd630736662554725e2a44f3187b15d0ca2ec619528f977344185dcd6624a3b"}
Apr 24 14:24:57.503365 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.503336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" event={"ID":"8ace694b-9bcf-445d-8e37-b1371853f469","Type":"ContainerStarted","Data":"679b3a6a964c4d8856591bec128639bb2383fca69eb0ed5bff0a7b95afc20ef8"}
Apr 24 14:24:57.504635 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.504614 2577 generic.go:358] "Generic (PLEG): container finished" podID="66176040-5bda-46d8-aaba-ef37c25ad37e" containerID="fb7e6d9208078ccd1254e08679e83ed73ad9540dde807175fa6e017ddf917acc" exitCode=0
Apr 24 14:24:57.504716 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.504675 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kplkp" event={"ID":"66176040-5bda-46d8-aaba-ef37c25ad37e","Type":"ContainerDied","Data":"fb7e6d9208078ccd1254e08679e83ed73ad9540dde807175fa6e017ddf917acc"}
Apr 24 14:24:57.506020 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.505856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" event={"ID":"e33e5322-8c6e-4112-b027-ca7e081d534b","Type":"ContainerStarted","Data":"5673d8d2f8513370ba8206ebb174897fca175b532f475d6a17558d9f17a4d9d2"}
Apr 24 14:24:57.515680 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.515634 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w4bb5" podStartSLOduration=3.784005019 podStartE2EDuration="20.515621148s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.800915284 +0000 UTC m=+3.058817285" lastFinishedPulling="2026-04-24 14:24:56.532531398 +0000 UTC m=+19.790433414" observedRunningTime="2026-04-24 14:24:57.515566646 +0000 UTC m=+20.773468670" watchObservedRunningTime="2026-04-24 14:24:57.515621148 +0000 UTC m=+20.773523197"
Apr 24 14:24:57.515786 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.515734 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-82.ec2.internal" podStartSLOduration=19.515727785 podStartE2EDuration="19.515727785s" podCreationTimestamp="2026-04-24 14:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:42.48729055 +0000 UTC m=+5.745192576" watchObservedRunningTime="2026-04-24 14:24:57.515727785 +0000 UTC m=+20.773629811"
Apr 24 14:24:57.553679 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.553394 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7nph4" podStartSLOduration=3.83989575 podStartE2EDuration="20.55337658s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.813024577 +0000 UTC m=+3.070926586" lastFinishedPulling="2026-04-24 14:24:56.526505403 +0000 UTC m=+19.784407416" observedRunningTime="2026-04-24 14:24:57.553206156 +0000 UTC m=+20.811108181" watchObservedRunningTime="2026-04-24 14:24:57.55337658 +0000 UTC m=+20.811278605"
Apr 24 14:24:57.577060 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.576812 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-p6pq2" podStartSLOduration=3.846004647 podStartE2EDuration="20.576793453s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.81007003 +0000 UTC m=+3.067972031" lastFinishedPulling="2026-04-24 14:24:56.540858834 +0000 UTC m=+19.798760837" observedRunningTime="2026-04-24 14:24:57.576374373 +0000 UTC m=+20.834276398" watchObservedRunningTime="2026-04-24 14:24:57.576793453 +0000 UTC m=+20.834695479"
Apr 24 14:24:57.597999 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.597944 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-h7rhc" podStartSLOduration=3.903377432 podStartE2EDuration="20.597925874s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.810430273 +0000 UTC m=+3.068332277" lastFinishedPulling="2026-04-24 14:24:56.504978704 +0000 UTC m=+19.762880719" observedRunningTime="2026-04-24 14:24:57.597779604 +0000 UTC m=+20.855681628" watchObservedRunningTime="2026-04-24 14:24:57.597925874 +0000 UTC m=+20.855827896"
Apr 24 14:24:57.613751 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.613706 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wt2bk" podStartSLOduration=3.91946232 podStartE2EDuration="20.613693286s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.810906944 +0000 UTC m=+3.068808955" lastFinishedPulling="2026-04-24 14:24:56.505137913 +0000 UTC m=+19.763039921" observedRunningTime="2026-04-24 14:24:57.613352548 +0000 UTC m=+20.871254582" watchObservedRunningTime="2026-04-24 14:24:57.613693286 +0000 UTC m=+20.871595311"
Apr 24 14:24:57.690768 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:57.690747 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 14:24:58.266339 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.266205 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T14:24:57.690764392Z","UUID":"79f5ecca-6ddd-4190-9037-e10794f46013","Handler":null,"Name":"","Endpoint":""}
Apr 24 14:24:58.269827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.269799 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 14:24:58.269827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.269832 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 14:24:58.509848 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.509787 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4fxh7" event={"ID":"c100dd1e-57a3-471e-998a-d002af692c13","Type":"ContainerStarted","Data":"5b2bf8e5e6a7f00b0f2a5071e8f5a7c6c1637350b603d591999c5e0482fc63d1"}
Apr 24 14:24:58.511852 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.511814 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" event={"ID":"e33e5322-8c6e-4112-b027-ca7e081d534b","Type":"ContainerStarted","Data":"dccea80ff5e33b6a59b1e62ff0de4b0ffcb03d841a4d5a9f03909ed00d246800"}
Apr 24 14:24:58.514570 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.514534 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log"
Apr 24 14:24:58.514930 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.514908 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerStarted","Data":"ffe5e8c66caa9dccda2531f65e50e9835409a4324f319463af6602961a51637f"}
Apr 24 14:24:58.525029 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.524972 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4fxh7" podStartSLOduration=4.804834254 podStartE2EDuration="21.524957955s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.80638313 +0000 UTC m=+3.064285135" lastFinishedPulling="2026-04-24 14:24:56.526506823 +0000 UTC m=+19.784408836" observedRunningTime="2026-04-24 14:24:58.524729194 +0000 UTC m=+21.782631218" watchObservedRunningTime="2026-04-24 14:24:58.524957955 +0000 UTC m=+21.782860173"
Apr 24 14:24:58.529926 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.529900 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wt2bk"
Apr
24 14:24:58.530690 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:58.530666 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wt2bk" Apr 24 14:24:59.325999 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:59.325965 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:24:59.325999 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:59.325984 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:24:59.326369 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:59.326081 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554" Apr 24 14:24:59.326369 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:24:59.326232 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46" Apr 24 14:24:59.531163 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:59.531129 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" event={"ID":"e33e5322-8c6e-4112-b027-ca7e081d534b","Type":"ContainerStarted","Data":"b835824b7fb190433415848bd6e1a1695050757f2e69d0dc8978a5bf250987c8"} Apr 24 14:24:59.531941 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:59.531515 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wt2bk" Apr 24 14:24:59.532051 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:59.532033 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wt2bk" Apr 24 14:24:59.554023 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:24:59.553978 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lv7hd" podStartSLOduration=3.524139364 podStartE2EDuration="22.553964798s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.807646049 +0000 UTC m=+3.065548053" lastFinishedPulling="2026-04-24 14:24:58.837471483 +0000 UTC m=+22.095373487" observedRunningTime="2026-04-24 14:24:59.552701527 +0000 UTC m=+22.810603565" watchObservedRunningTime="2026-04-24 14:24:59.553964798 +0000 UTC m=+22.811866821" Apr 24 14:25:00.538543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:00.538515 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log" Apr 24 14:25:00.539119 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:00.538936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" 
event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerStarted","Data":"aa3e6655bf1a2f61bbaa40781430cc4ca211e9af18bf0dc8bf82d52c87aec427"} Apr 24 14:25:01.326207 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:01.326167 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:25:01.326207 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:01.326201 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:01.326802 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:01.326718 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46" Apr 24 14:25:01.326952 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:01.326801 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554" Apr 24 14:25:02.544226 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:02.544021 2577 generic.go:358] "Generic (PLEG): container finished" podID="66176040-5bda-46d8-aaba-ef37c25ad37e" containerID="853a974e07692041a41c65faf7c44aa5a25db586b9e5cfef797cc8bf33e065a2" exitCode=0 Apr 24 14:25:02.544760 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:02.544098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kplkp" event={"ID":"66176040-5bda-46d8-aaba-ef37c25ad37e","Type":"ContainerDied","Data":"853a974e07692041a41c65faf7c44aa5a25db586b9e5cfef797cc8bf33e065a2"} Apr 24 14:25:02.547369 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:02.547355 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log" Apr 24 14:25:02.547678 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:02.547655 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerStarted","Data":"0eb6772e94b0605c299b05310c7222fc109fa75ebc956744508e645715c60af2"} Apr 24 14:25:02.547907 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:02.547885 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:25:02.548016 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:02.547915 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:25:02.548213 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:02.548198 2577 scope.go:117] "RemoveContainer" containerID="fb18f39e8f30ccf73e60cab2128be8338bfbbea920b45fdefac79fd631d10d5d" Apr 24 14:25:02.563156 ip-10-0-134-82 kubenswrapper[2577]: I0424 
14:25:02.563134 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:25:03.325942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.325913 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:03.326109 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:03.326034 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554" Apr 24 14:25:03.326109 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.326085 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:25:03.326238 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:03.326213 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46" Apr 24 14:25:03.561942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.561671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log" Apr 24 14:25:03.562498 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.562301 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" event={"ID":"48c500e5-ff8b-4e0c-bdda-745035b2e024","Type":"ContainerStarted","Data":"6e73afd314e1bd669e99b69883f1c5e94b6cb2952371ea05d12cbbcd69890435"} Apr 24 14:25:03.562861 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.562837 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:25:03.582035 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.581950 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:25:03.594792 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.594670 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" podStartSLOduration=9.610067949 podStartE2EDuration="26.594650271s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.801906764 +0000 UTC m=+3.059808766" lastFinishedPulling="2026-04-24 14:24:56.786489073 +0000 UTC m=+20.044391088" observedRunningTime="2026-04-24 14:25:03.591849863 +0000 UTC m=+26.849751887" watchObservedRunningTime="2026-04-24 14:25:03.594650271 +0000 UTC m=+26.852552297" Apr 24 14:25:03.811857 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.811808 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rn5ql"] Apr 24 14:25:03.812045 ip-10-0-134-82 kubenswrapper[2577]: 
I0424 14:25:03.811982 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:03.812124 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:03.812089 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554" Apr 24 14:25:03.812577 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.812553 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-49nxb"] Apr 24 14:25:03.812677 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:03.812664 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:25:03.812808 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:03.812784 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46" Apr 24 14:25:04.566202 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:04.566164 2577 generic.go:358] "Generic (PLEG): container finished" podID="66176040-5bda-46d8-aaba-ef37c25ad37e" containerID="671f0818dfc9fb1b919857600f1bf3955f71f6e67e109993f55366306f841949" exitCode=0 Apr 24 14:25:04.566836 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:04.566250 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kplkp" event={"ID":"66176040-5bda-46d8-aaba-ef37c25ad37e","Type":"ContainerDied","Data":"671f0818dfc9fb1b919857600f1bf3955f71f6e67e109993f55366306f841949"} Apr 24 14:25:05.326040 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:05.326001 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:05.326193 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:05.326084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:25:05.326270 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:05.326230 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554" Apr 24 14:25:05.326346 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:05.326324 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46" Apr 24 14:25:06.572115 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:06.572079 2577 generic.go:358] "Generic (PLEG): container finished" podID="66176040-5bda-46d8-aaba-ef37c25ad37e" containerID="fa42abf84495bda2fd3566fbe7fac1189f132bb05a0858ad367f9a2aac60fff7" exitCode=0 Apr 24 14:25:06.572614 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:06.572141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kplkp" event={"ID":"66176040-5bda-46d8-aaba-ef37c25ad37e","Type":"ContainerDied","Data":"fa42abf84495bda2fd3566fbe7fac1189f132bb05a0858ad367f9a2aac60fff7"} Apr 24 14:25:07.326722 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:07.326687 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:07.326910 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:07.326795 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554" Apr 24 14:25:07.326910 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:07.326848 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:25:07.327021 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:07.326981 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46" Apr 24 14:25:09.326696 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.326430 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:09.327162 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:09.326797 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn5ql" podUID="442ed584-8835-435d-8b83-97804ed0f554" Apr 24 14:25:09.327162 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.326536 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:25:09.327162 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:09.326999 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46" Apr 24 14:25:09.602847 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.602820 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-82.ec2.internal" event="NodeReady" Apr 24 14:25:09.603031 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.602991 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 14:25:09.656059 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.656025 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wp7sh"] Apr 24 14:25:09.684927 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.684894 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vsblc"] Apr 24 14:25:09.685112 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.685039 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.687496 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.687465 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 14:25:09.687647 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.687467 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bxm2\"" Apr 24 14:25:09.687647 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.687510 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 14:25:09.700707 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.700664 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wp7sh"] Apr 24 14:25:09.700707 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.700700 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vsblc"] Apr 24 
14:25:09.700884 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.700786 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:09.703456 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.703412 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 14:25:09.703456 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.703424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfc9t\"" Apr 24 14:25:09.703456 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.703424 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 14:25:09.703684 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.703443 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 14:25:09.844254 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.844219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.844254 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.844255 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9a9c7a0-840a-4811-89fd-c85ae43af97f-tmp-dir\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.844464 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.844276 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v8vz\" (UniqueName: \"kubernetes.io/projected/a9a9c7a0-840a-4811-89fd-c85ae43af97f-kube-api-access-6v8vz\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.844464 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.844406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:09.844464 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.844436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9a9c7a0-840a-4811-89fd-c85ae43af97f-config-volume\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.844464 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.844455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkd9l\" (UniqueName: \"kubernetes.io/projected/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-kube-api-access-bkd9l\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:09.945609 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.945519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:09.945609 ip-10-0-134-82 kubenswrapper[2577]: 
I0424 14:25:09.945572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9a9c7a0-840a-4811-89fd-c85ae43af97f-config-volume\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.945609 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.945596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkd9l\" (UniqueName: \"kubernetes.io/projected/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-kube-api-access-bkd9l\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:09.945960 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.945638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.945960 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.945660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9a9c7a0-840a-4811-89fd-c85ae43af97f-tmp-dir\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.945960 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:09.945634 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:09.945960 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.945692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6v8vz\" (UniqueName: \"kubernetes.io/projected/a9a9c7a0-840a-4811-89fd-c85ae43af97f-kube-api-access-6v8vz\") 
pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.945960 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:09.945741 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:09.945960 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:09.945751 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert podName:ce9c2869-28ce-47d7-a2d8-19e09fc982ba nodeName:}" failed. No retries permitted until 2026-04-24 14:25:10.445729182 +0000 UTC m=+33.703631189 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert") pod "ingress-canary-vsblc" (UID: "ce9c2869-28ce-47d7-a2d8-19e09fc982ba") : secret "canary-serving-cert" not found Apr 24 14:25:09.945960 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:09.945801 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls podName:a9a9c7a0-840a-4811-89fd-c85ae43af97f nodeName:}" failed. No retries permitted until 2026-04-24 14:25:10.445782518 +0000 UTC m=+33.703684524 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls") pod "dns-default-wp7sh" (UID: "a9a9c7a0-840a-4811-89fd-c85ae43af97f") : secret "dns-default-metrics-tls" not found Apr 24 14:25:09.946317 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.946089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9a9c7a0-840a-4811-89fd-c85ae43af97f-tmp-dir\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.946317 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.946238 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9a9c7a0-840a-4811-89fd-c85ae43af97f-config-volume\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.957537 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.957506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v8vz\" (UniqueName: \"kubernetes.io/projected/a9a9c7a0-840a-4811-89fd-c85ae43af97f-kube-api-access-6v8vz\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:09.957680 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:09.957591 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkd9l\" (UniqueName: \"kubernetes.io/projected/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-kube-api-access-bkd9l\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:10.449844 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:10.449801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:10.450337 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:10.449996 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:10.450337 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:10.450026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:10.450337 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:10.450086 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert podName:ce9c2869-28ce-47d7-a2d8-19e09fc982ba nodeName:}" failed. No retries permitted until 2026-04-24 14:25:11.450064345 +0000 UTC m=+34.707966359 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert") pod "ingress-canary-vsblc" (UID: "ce9c2869-28ce-47d7-a2d8-19e09fc982ba") : secret "canary-serving-cert" not found Apr 24 14:25:10.450337 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:10.450135 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:10.450337 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:10.450206 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls podName:a9a9c7a0-840a-4811-89fd-c85ae43af97f nodeName:}" failed. 
No retries permitted until 2026-04-24 14:25:11.450188904 +0000 UTC m=+34.708090922 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls") pod "dns-default-wp7sh" (UID: "a9a9c7a0-840a-4811-89fd-c85ae43af97f") : secret "dns-default-metrics-tls" not found Apr 24 14:25:10.953657 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:10.953612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:25:10.953958 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:10.953796 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:25:10.953958 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:10.953899 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs podName:216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:42.953858842 +0000 UTC m=+66.211760872 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs") pod "network-metrics-daemon-49nxb" (UID: "216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:25:11.054539 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.054500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:11.054738 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:11.054691 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:25:11.054738 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:11.054718 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:25:11.054738 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:11.054733 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dg29k for pod openshift-network-diagnostics/network-check-target-rn5ql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:25:11.054851 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:11.054810 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k podName:442ed584-8835-435d-8b83-97804ed0f554 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:25:43.054791302 +0000 UTC m=+66.312693323 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg29k" (UniqueName: "kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k") pod "network-check-target-rn5ql" (UID: "442ed584-8835-435d-8b83-97804ed0f554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:25:11.325594 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.325512 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:11.325594 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.325558 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:25:11.329580 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.329549 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-crqvq\"" Apr 24 14:25:11.329707 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.329610 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:25:11.329707 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.329617 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qh9cm\"" Apr 24 14:25:11.329707 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.329556 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:25:11.329863 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.329561 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:25:11.457692 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.457649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:11.458190 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:11.457710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:11.458190 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:11.457881 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:11.458190 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:11.457964 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls podName:a9a9c7a0-840a-4811-89fd-c85ae43af97f nodeName:}" failed. No retries permitted until 2026-04-24 14:25:13.457944385 +0000 UTC m=+36.715846391 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls") pod "dns-default-wp7sh" (UID: "a9a9c7a0-840a-4811-89fd-c85ae43af97f") : secret "dns-default-metrics-tls" not found Apr 24 14:25:11.458190 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:11.457881 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:11.458190 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:11.458054 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert podName:ce9c2869-28ce-47d7-a2d8-19e09fc982ba nodeName:}" failed. No retries permitted until 2026-04-24 14:25:13.458033259 +0000 UTC m=+36.715935265 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert") pod "ingress-canary-vsblc" (UID: "ce9c2869-28ce-47d7-a2d8-19e09fc982ba") : secret "canary-serving-cert" not found Apr 24 14:25:13.231594 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.231560 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf"] Apr 24 14:25:13.242889 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.242837 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.243060 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.242967 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf"] Apr 24 14:25:13.245283 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.245254 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 14:25:13.245283 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.245254 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 14:25:13.245556 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.245542 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 14:25:13.246551 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.246535 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 14:25:13.371327 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.371295 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/045bf3e9-8395-41bc-b778-6a5921eb1095-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b768cd975-x56qf\" (UID: \"045bf3e9-8395-41bc-b778-6a5921eb1095\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.371502 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.371353 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/045bf3e9-8395-41bc-b778-6a5921eb1095-tmp\") pod \"klusterlet-addon-workmgr-5b768cd975-x56qf\" (UID: \"045bf3e9-8395-41bc-b778-6a5921eb1095\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.371502 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.371421 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wnhv\" (UniqueName: \"kubernetes.io/projected/045bf3e9-8395-41bc-b778-6a5921eb1095-kube-api-access-9wnhv\") pod \"klusterlet-addon-workmgr-5b768cd975-x56qf\" (UID: \"045bf3e9-8395-41bc-b778-6a5921eb1095\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.472042 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.472003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wnhv\" (UniqueName: \"kubernetes.io/projected/045bf3e9-8395-41bc-b778-6a5921eb1095-kube-api-access-9wnhv\") pod \"klusterlet-addon-workmgr-5b768cd975-x56qf\" (UID: \"045bf3e9-8395-41bc-b778-6a5921eb1095\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.472195 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.472062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:13.472195 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.472109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 
14:25:13.472195 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.472140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/045bf3e9-8395-41bc-b778-6a5921eb1095-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b768cd975-x56qf\" (UID: \"045bf3e9-8395-41bc-b778-6a5921eb1095\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.472283 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.472214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/045bf3e9-8395-41bc-b778-6a5921eb1095-tmp\") pod \"klusterlet-addon-workmgr-5b768cd975-x56qf\" (UID: \"045bf3e9-8395-41bc-b778-6a5921eb1095\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.472283 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:13.472237 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:13.472345 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:13.472294 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert podName:ce9c2869-28ce-47d7-a2d8-19e09fc982ba nodeName:}" failed. No retries permitted until 2026-04-24 14:25:17.472276081 +0000 UTC m=+40.730178088 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert") pod "ingress-canary-vsblc" (UID: "ce9c2869-28ce-47d7-a2d8-19e09fc982ba") : secret "canary-serving-cert" not found Apr 24 14:25:13.472401 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:13.472356 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:13.472401 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:13.472397 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls podName:a9a9c7a0-840a-4811-89fd-c85ae43af97f nodeName:}" failed. No retries permitted until 2026-04-24 14:25:17.472384173 +0000 UTC m=+40.730286178 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls") pod "dns-default-wp7sh" (UID: "a9a9c7a0-840a-4811-89fd-c85ae43af97f") : secret "dns-default-metrics-tls" not found Apr 24 14:25:13.472617 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.472584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/045bf3e9-8395-41bc-b778-6a5921eb1095-tmp\") pod \"klusterlet-addon-workmgr-5b768cd975-x56qf\" (UID: \"045bf3e9-8395-41bc-b778-6a5921eb1095\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.475278 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.475258 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/045bf3e9-8395-41bc-b778-6a5921eb1095-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b768cd975-x56qf\" (UID: \"045bf3e9-8395-41bc-b778-6a5921eb1095\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.480838 
ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.480815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wnhv\" (UniqueName: \"kubernetes.io/projected/045bf3e9-8395-41bc-b778-6a5921eb1095-kube-api-access-9wnhv\") pod \"klusterlet-addon-workmgr-5b768cd975-x56qf\" (UID: \"045bf3e9-8395-41bc-b778-6a5921eb1095\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.551357 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.551275 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:13.588953 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.588920 2577 generic.go:358] "Generic (PLEG): container finished" podID="66176040-5bda-46d8-aaba-ef37c25ad37e" containerID="1ff296106c4d06cf178975e54e87e290f1fa88c448fa1d21e78ee35810ca6d63" exitCode=0 Apr 24 14:25:13.589095 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.588992 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kplkp" event={"ID":"66176040-5bda-46d8-aaba-ef37c25ad37e","Type":"ContainerDied","Data":"1ff296106c4d06cf178975e54e87e290f1fa88c448fa1d21e78ee35810ca6d63"} Apr 24 14:25:13.729948 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:13.729782 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf"] Apr 24 14:25:13.733198 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:25:13.733172 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod045bf3e9_8395_41bc_b778_6a5921eb1095.slice/crio-5e4a4d871a8c264242754906279c43698c5ca4e0e4187b89a8d1a283baaba25a WatchSource:0}: Error finding container 5e4a4d871a8c264242754906279c43698c5ca4e0e4187b89a8d1a283baaba25a: Status 404 returned error can't find the 
container with id 5e4a4d871a8c264242754906279c43698c5ca4e0e4187b89a8d1a283baaba25a Apr 24 14:25:14.595630 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:14.595589 2577 generic.go:358] "Generic (PLEG): container finished" podID="66176040-5bda-46d8-aaba-ef37c25ad37e" containerID="b990d4eea35185137609594ccb06c6a1aa598048a6c97500a28ad2423a81dab2" exitCode=0 Apr 24 14:25:14.596242 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:14.595785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kplkp" event={"ID":"66176040-5bda-46d8-aaba-ef37c25ad37e","Type":"ContainerDied","Data":"b990d4eea35185137609594ccb06c6a1aa598048a6c97500a28ad2423a81dab2"} Apr 24 14:25:14.597406 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:14.597372 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" event={"ID":"045bf3e9-8395-41bc-b778-6a5921eb1095","Type":"ContainerStarted","Data":"5e4a4d871a8c264242754906279c43698c5ca4e0e4187b89a8d1a283baaba25a"} Apr 24 14:25:15.603286 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:15.603240 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kplkp" event={"ID":"66176040-5bda-46d8-aaba-ef37c25ad37e","Type":"ContainerStarted","Data":"9f3c93cd7f30e7dc900999e1f61130128d10458de6f46eb5e32d632ee7a9226b"} Apr 24 14:25:15.629699 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:15.629645 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kplkp" podStartSLOduration=6.00715433 podStartE2EDuration="38.6296257s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.808200391 +0000 UTC m=+3.066102394" lastFinishedPulling="2026-04-24 14:25:12.430671759 +0000 UTC m=+35.688573764" observedRunningTime="2026-04-24 14:25:15.628260987 +0000 UTC m=+38.886163012" 
watchObservedRunningTime="2026-04-24 14:25:15.6296257 +0000 UTC m=+38.887527727" Apr 24 14:25:17.501530 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:17.501475 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:17.501992 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:17.501627 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:17.501992 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:17.501633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:17.501992 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:17.501695 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert podName:ce9c2869-28ce-47d7-a2d8-19e09fc982ba nodeName:}" failed. No retries permitted until 2026-04-24 14:25:25.501679941 +0000 UTC m=+48.759581943 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert") pod "ingress-canary-vsblc" (UID: "ce9c2869-28ce-47d7-a2d8-19e09fc982ba") : secret "canary-serving-cert" not found Apr 24 14:25:17.501992 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:17.501722 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:17.501992 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:17.501773 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls podName:a9a9c7a0-840a-4811-89fd-c85ae43af97f nodeName:}" failed. No retries permitted until 2026-04-24 14:25:25.501757555 +0000 UTC m=+48.759659559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls") pod "dns-default-wp7sh" (UID: "a9a9c7a0-840a-4811-89fd-c85ae43af97f") : secret "dns-default-metrics-tls" not found Apr 24 14:25:17.609014 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:17.608926 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" event={"ID":"045bf3e9-8395-41bc-b778-6a5921eb1095","Type":"ContainerStarted","Data":"dbae6b6c5428581e79cce37bcc699e2ccedc099ca4ca2ca63eb0c11652f723b0"} Apr 24 14:25:17.609237 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:17.609218 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:17.610851 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:17.610827 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" Apr 24 14:25:17.625466 ip-10-0-134-82 
kubenswrapper[2577]: I0424 14:25:17.625423 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" podStartSLOduration=1.006733927 podStartE2EDuration="4.625411381s" podCreationTimestamp="2026-04-24 14:25:13 +0000 UTC" firstStartedPulling="2026-04-24 14:25:13.735076581 +0000 UTC m=+36.992978583" lastFinishedPulling="2026-04-24 14:25:17.353754035 +0000 UTC m=+40.611656037" observedRunningTime="2026-04-24 14:25:17.62455501 +0000 UTC m=+40.882457058" watchObservedRunningTime="2026-04-24 14:25:17.625411381 +0000 UTC m=+40.883313405" Apr 24 14:25:25.564121 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:25.564075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:25.564121 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:25.564138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:25.564638 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:25.564243 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:25.564638 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:25.564306 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls podName:a9a9c7a0-840a-4811-89fd-c85ae43af97f nodeName:}" failed. No retries permitted until 2026-04-24 14:25:41.564290135 +0000 UTC m=+64.822192136 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls") pod "dns-default-wp7sh" (UID: "a9a9c7a0-840a-4811-89fd-c85ae43af97f") : secret "dns-default-metrics-tls" not found Apr 24 14:25:25.564638 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:25.564243 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:25.564638 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:25.564360 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert podName:ce9c2869-28ce-47d7-a2d8-19e09fc982ba nodeName:}" failed. No retries permitted until 2026-04-24 14:25:41.564349087 +0000 UTC m=+64.822251092 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert") pod "ingress-canary-vsblc" (UID: "ce9c2869-28ce-47d7-a2d8-19e09fc982ba") : secret "canary-serving-cert" not found Apr 24 14:25:35.586147 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:35.586109 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tdc2t" Apr 24 14:25:41.579840 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:41.579787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:25:41.580263 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:41.579909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: 
\"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:25:41.580263 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:41.580009 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:41.580263 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:41.580076 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert podName:ce9c2869-28ce-47d7-a2d8-19e09fc982ba nodeName:}" failed. No retries permitted until 2026-04-24 14:26:13.580061646 +0000 UTC m=+96.837963648 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert") pod "ingress-canary-vsblc" (UID: "ce9c2869-28ce-47d7-a2d8-19e09fc982ba") : secret "canary-serving-cert" not found Apr 24 14:25:41.580263 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:41.580009 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:41.580263 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:41.580140 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls podName:a9a9c7a0-840a-4811-89fd-c85ae43af97f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:13.580129376 +0000 UTC m=+96.838031384 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls") pod "dns-default-wp7sh" (UID: "a9a9c7a0-840a-4811-89fd-c85ae43af97f") : secret "dns-default-metrics-tls" not found Apr 24 14:25:42.991504 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:42.991454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb" Apr 24 14:25:42.994476 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:42.994442 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:25:43.001632 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:43.001609 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 14:25:43.001727 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:25:43.001670 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs podName:216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:47.001655031 +0000 UTC m=+130.259557033 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs") pod "network-metrics-daemon-49nxb" (UID: "216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46") : secret "metrics-daemon-secret" not found Apr 24 14:25:43.092115 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:43.092073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:43.095141 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:43.095118 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:25:43.104559 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:43.104537 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:25:43.115720 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:43.115694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg29k\" (UniqueName: \"kubernetes.io/projected/442ed584-8835-435d-8b83-97804ed0f554-kube-api-access-dg29k\") pod \"network-check-target-rn5ql\" (UID: \"442ed584-8835-435d-8b83-97804ed0f554\") " pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:43.140744 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:43.140709 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qh9cm\"" Apr 24 14:25:43.149025 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:43.148999 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:43.262508 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:43.262435 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rn5ql"] Apr 24 14:25:43.265695 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:25:43.265669 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod442ed584_8835_435d_8b83_97804ed0f554.slice/crio-193c5eddf9a786521feeeded6579fbf2af938ce26c25b6355d8d4261eaa0468a WatchSource:0}: Error finding container 193c5eddf9a786521feeeded6579fbf2af938ce26c25b6355d8d4261eaa0468a: Status 404 returned error can't find the container with id 193c5eddf9a786521feeeded6579fbf2af938ce26c25b6355d8d4261eaa0468a Apr 24 14:25:43.656559 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:43.656515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rn5ql" event={"ID":"442ed584-8835-435d-8b83-97804ed0f554","Type":"ContainerStarted","Data":"193c5eddf9a786521feeeded6579fbf2af938ce26c25b6355d8d4261eaa0468a"} Apr 24 14:25:46.663659 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:46.663522 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rn5ql" event={"ID":"442ed584-8835-435d-8b83-97804ed0f554","Type":"ContainerStarted","Data":"22b0f6274f4af3db8ca9e9731def84797a82e7723da7e96647d9fd615bfa4b79"} Apr 24 14:25:46.664117 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:46.663665 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:25:46.680520 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:25:46.680472 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rn5ql" 
podStartSLOduration=67.136236444 podStartE2EDuration="1m9.680459115s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:25:43.267608446 +0000 UTC m=+66.525510447" lastFinishedPulling="2026-04-24 14:25:45.811831112 +0000 UTC m=+69.069733118" observedRunningTime="2026-04-24 14:25:46.67991156 +0000 UTC m=+69.937813585" watchObservedRunningTime="2026-04-24 14:25:46.680459115 +0000 UTC m=+69.938361116" Apr 24 14:26:13.602585 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:13.602552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:26:13.602996 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:13.602612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh" Apr 24 14:26:13.602996 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:13.602705 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:26:13.602996 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:13.602744 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:26:13.602996 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:13.602777 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls podName:a9a9c7a0-840a-4811-89fd-c85ae43af97f nodeName:}" failed. 
No retries permitted until 2026-04-24 14:27:17.602760772 +0000 UTC m=+160.860662774 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls") pod "dns-default-wp7sh" (UID: "a9a9c7a0-840a-4811-89fd-c85ae43af97f") : secret "dns-default-metrics-tls" not found Apr 24 14:26:13.602996 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:13.602797 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert podName:ce9c2869-28ce-47d7-a2d8-19e09fc982ba nodeName:}" failed. No retries permitted until 2026-04-24 14:27:17.602784205 +0000 UTC m=+160.860686206 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert") pod "ingress-canary-vsblc" (UID: "ce9c2869-28ce-47d7-a2d8-19e09fc982ba") : secret "canary-serving-cert" not found Apr 24 14:26:17.668206 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:17.668175 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rn5ql" Apr 24 14:26:35.351029 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.350991 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"] Apr 24 14:26:35.356184 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.356166 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" Apr 24 14:26:35.359137 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.359111 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 14:26:35.360572 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.360545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-z4pdj\"" Apr 24 14:26:35.360849 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.360665 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 14:26:35.361142 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.360838 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:26:35.365412 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.365390 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"] Apr 24 14:26:35.450275 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.450242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" Apr 24 14:26:35.450455 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.450301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4c2\" (UniqueName: 
\"kubernetes.io/projected/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-kube-api-access-nz4c2\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" Apr 24 14:26:35.459883 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.459843 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr"] Apr 24 14:26:35.463370 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.463350 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf"] Apr 24 14:26:35.463517 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.463499 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" Apr 24 14:26:35.466139 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.466118 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 14:26:35.466239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.466180 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 14:26:35.466326 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.466313 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:26:35.466389 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.466335 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 14:26:35.466444 ip-10-0-134-82 kubenswrapper[2577]: I0424 
14:26:35.466424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-bhr7q\"" Apr 24 14:26:35.466632 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.466616 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bbcccc89f-jszxr"] Apr 24 14:26:35.466789 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.466773 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" Apr 24 14:26:35.469337 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.469314 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:26:35.469440 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.469382 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 14:26:35.469440 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.469395 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 14:26:35.469440 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.469404 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 14:26:35.469669 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.469655 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-zz4mv\"" Apr 24 14:26:35.469728 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.469685 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.472945 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.472925 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 14:26:35.473051 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.472932 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 14:26:35.473111 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.473079 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr"] Apr 24 14:26:35.473283 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.473259 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 14:26:35.473483 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.473467 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpm4v\"" Apr 24 14:26:35.474017 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.473995 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf"] Apr 24 14:26:35.479739 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.479717 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 14:26:35.480702 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.480681 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bbcccc89f-jszxr"] Apr 24 14:26:35.551360 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551326 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" Apr 24 14:26:35.551360 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ww5b\" (UniqueName: \"kubernetes.io/projected/5bf1887d-7526-48ef-aa33-2cf15cf8ced2-kube-api-access-7ww5b\") pod \"kube-storage-version-migrator-operator-6769c5d45-8fpnr\" (UID: \"5bf1887d-7526-48ef-aa33-2cf15cf8ced2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" Apr 24 14:26:35.551629 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03102919-727f-42a2-8035-d799d73184d6-config\") pod \"service-ca-operator-d6fc45fc5-vzfcf\" (UID: \"03102919-727f-42a2-8035-d799d73184d6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" Apr 24 14:26:35.551629 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad2ecded-9941-4d85-be6f-1a36e7e4229a-ca-trust-extracted\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.551629 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-trusted-ca\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.551629 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-installation-pull-secrets\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.551629 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:35.551485 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:26:35.551629 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551526 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03102919-727f-42a2-8035-d799d73184d6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vzfcf\" (UID: \"03102919-727f-42a2-8035-d799d73184d6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" Apr 24 14:26:35.551629 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:35.551562 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls podName:532e4d1a-0b12-40ee-8f73-cf439eeb8d0d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:36.051538799 +0000 UTC m=+119.309440804 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-24knp" (UID: "532e4d1a-0b12-40ee-8f73-cf439eeb8d0d") : secret "samples-operator-tls" not found Apr 24 14:26:35.551629 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551586 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2dx9\" (UniqueName: \"kubernetes.io/projected/03102919-727f-42a2-8035-d799d73184d6-kube-api-access-h2dx9\") pod \"service-ca-operator-d6fc45fc5-vzfcf\" (UID: \"03102919-727f-42a2-8035-d799d73184d6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" Apr 24 14:26:35.551629 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.552071 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf1887d-7526-48ef-aa33-2cf15cf8ced2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8fpnr\" (UID: \"5bf1887d-7526-48ef-aa33-2cf15cf8ced2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" Apr 24 14:26:35.552071 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-certificates\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.552071 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-bound-sa-token\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.552071 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4c2\" (UniqueName: \"kubernetes.io/projected/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-kube-api-access-nz4c2\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" Apr 24 14:26:35.552071 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z58dj\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-kube-api-access-z58dj\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.552071 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf1887d-7526-48ef-aa33-2cf15cf8ced2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8fpnr\" (UID: 
\"5bf1887d-7526-48ef-aa33-2cf15cf8ced2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" Apr 24 14:26:35.552071 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.551852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-image-registry-private-configuration\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.561128 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.561099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4c2\" (UniqueName: \"kubernetes.io/projected/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-kube-api-access-nz4c2\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" Apr 24 14:26:35.652361 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf1887d-7526-48ef-aa33-2cf15cf8ced2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8fpnr\" (UID: \"5bf1887d-7526-48ef-aa33-2cf15cf8ced2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" Apr 24 14:26:35.652361 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-image-registry-private-configuration\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " 
pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.652588 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ww5b\" (UniqueName: \"kubernetes.io/projected/5bf1887d-7526-48ef-aa33-2cf15cf8ced2-kube-api-access-7ww5b\") pod \"kube-storage-version-migrator-operator-6769c5d45-8fpnr\" (UID: \"5bf1887d-7526-48ef-aa33-2cf15cf8ced2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" Apr 24 14:26:35.652588 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03102919-727f-42a2-8035-d799d73184d6-config\") pod \"service-ca-operator-d6fc45fc5-vzfcf\" (UID: \"03102919-727f-42a2-8035-d799d73184d6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" Apr 24 14:26:35.652588 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad2ecded-9941-4d85-be6f-1a36e7e4229a-ca-trust-extracted\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.652588 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-trusted-ca\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.652588 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652473 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-installation-pull-secrets\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.652588 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03102919-727f-42a2-8035-d799d73184d6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vzfcf\" (UID: \"03102919-727f-42a2-8035-d799d73184d6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" Apr 24 14:26:35.652588 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2dx9\" (UniqueName: \"kubernetes.io/projected/03102919-727f-42a2-8035-d799d73184d6-kube-api-access-h2dx9\") pod \"service-ca-operator-d6fc45fc5-vzfcf\" (UID: \"03102919-727f-42a2-8035-d799d73184d6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" Apr 24 14:26:35.652588 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.652998 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf1887d-7526-48ef-aa33-2cf15cf8ced2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8fpnr\" (UID: \"5bf1887d-7526-48ef-aa33-2cf15cf8ced2\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" Apr 24 14:26:35.652998 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652640 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-certificates\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.652998 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-bound-sa-token\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.652998 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z58dj\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-kube-api-access-z58dj\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.652998 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.652957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad2ecded-9941-4d85-be6f-1a36e7e4229a-ca-trust-extracted\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" Apr 24 14:26:35.653243 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.653027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf1887d-7526-48ef-aa33-2cf15cf8ced2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8fpnr\" (UID: \"5bf1887d-7526-48ef-aa33-2cf15cf8ced2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" Apr 24 14:26:35.653465 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:35.653442 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:26:35.653465 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:35.653463 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bbcccc89f-jszxr: secret "image-registry-tls" not found Apr 24 14:26:35.653629 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:35.653516 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls podName:ad2ecded-9941-4d85-be6f-1a36e7e4229a nodeName:}" failed. No retries permitted until 2026-04-24 14:26:36.15349782 +0000 UTC m=+119.411399839 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls") pod "image-registry-7bbcccc89f-jszxr" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a") : secret "image-registry-tls" not found
Apr 24 14:26:35.653629 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.653538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-certificates\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:35.653737 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.653639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-trusted-ca\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:35.653910 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.653862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03102919-727f-42a2-8035-d799d73184d6-config\") pod \"service-ca-operator-d6fc45fc5-vzfcf\" (UID: \"03102919-727f-42a2-8035-d799d73184d6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf"
Apr 24 14:26:35.655159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.655135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03102919-727f-42a2-8035-d799d73184d6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vzfcf\" (UID: \"03102919-727f-42a2-8035-d799d73184d6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf"
Apr 24 14:26:35.655337 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.655314 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-image-registry-private-configuration\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:35.655522 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.655504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf1887d-7526-48ef-aa33-2cf15cf8ced2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8fpnr\" (UID: \"5bf1887d-7526-48ef-aa33-2cf15cf8ced2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr"
Apr 24 14:26:35.655775 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.655758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-installation-pull-secrets\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:35.663100 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.663078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ww5b\" (UniqueName: \"kubernetes.io/projected/5bf1887d-7526-48ef-aa33-2cf15cf8ced2-kube-api-access-7ww5b\") pod \"kube-storage-version-migrator-operator-6769c5d45-8fpnr\" (UID: \"5bf1887d-7526-48ef-aa33-2cf15cf8ced2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr"
Apr 24 14:26:35.663427 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.663408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2dx9\" (UniqueName: \"kubernetes.io/projected/03102919-727f-42a2-8035-d799d73184d6-kube-api-access-h2dx9\") pod \"service-ca-operator-d6fc45fc5-vzfcf\" (UID: \"03102919-727f-42a2-8035-d799d73184d6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf"
Apr 24 14:26:35.663797 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.663778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z58dj\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-kube-api-access-z58dj\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:35.663919 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.663903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-bound-sa-token\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:35.775174 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.775143 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr"
Apr 24 14:26:35.781771 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.781743 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf"
Apr 24 14:26:35.896703 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.896671 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr"]
Apr 24 14:26:35.899479 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:26:35.899449 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf1887d_7526_48ef_aa33_2cf15cf8ced2.slice/crio-db0e41e2117d75b5366571598541e91eaadb62450833527fdd903dbd4cfadb0d WatchSource:0}: Error finding container db0e41e2117d75b5366571598541e91eaadb62450833527fdd903dbd4cfadb0d: Status 404 returned error can't find the container with id db0e41e2117d75b5366571598541e91eaadb62450833527fdd903dbd4cfadb0d
Apr 24 14:26:35.910578 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:35.910555 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf"]
Apr 24 14:26:35.913231 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:26:35.913210 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03102919_727f_42a2_8035_d799d73184d6.slice/crio-30d07b9f3f28781eb20b9bcc04a59323d7a64f9475d420ad3ba2533496cff218 WatchSource:0}: Error finding container 30d07b9f3f28781eb20b9bcc04a59323d7a64f9475d420ad3ba2533496cff218: Status 404 returned error can't find the container with id 30d07b9f3f28781eb20b9bcc04a59323d7a64f9475d420ad3ba2533496cff218
Apr 24 14:26:36.056143 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:36.056105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"
Apr 24 14:26:36.056319 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:36.056240 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:26:36.056319 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:36.056298 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls podName:532e4d1a-0b12-40ee-8f73-cf439eeb8d0d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:37.056283376 +0000 UTC m=+120.314185378 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-24knp" (UID: "532e4d1a-0b12-40ee-8f73-cf439eeb8d0d") : secret "samples-operator-tls" not found
Apr 24 14:26:36.157528 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:36.157447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:36.157678 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:36.157554 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:26:36.157678 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:36.157566 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bbcccc89f-jszxr: secret "image-registry-tls" not found
Apr 24 14:26:36.157678 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:36.157618 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls podName:ad2ecded-9941-4d85-be6f-1a36e7e4229a nodeName:}" failed. No retries permitted until 2026-04-24 14:26:37.157604804 +0000 UTC m=+120.415506806 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls") pod "image-registry-7bbcccc89f-jszxr" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a") : secret "image-registry-tls" not found
Apr 24 14:26:36.763252 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:36.763194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" event={"ID":"03102919-727f-42a2-8035-d799d73184d6","Type":"ContainerStarted","Data":"30d07b9f3f28781eb20b9bcc04a59323d7a64f9475d420ad3ba2533496cff218"}
Apr 24 14:26:36.764367 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:36.764332 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" event={"ID":"5bf1887d-7526-48ef-aa33-2cf15cf8ced2","Type":"ContainerStarted","Data":"db0e41e2117d75b5366571598541e91eaadb62450833527fdd903dbd4cfadb0d"}
Apr 24 14:26:37.064515 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:37.064411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"
Apr 24 14:26:37.064673 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:37.064565 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:26:37.064673 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:37.064634 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls podName:532e4d1a-0b12-40ee-8f73-cf439eeb8d0d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:39.064617925 +0000 UTC m=+122.322519926 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-24knp" (UID: "532e4d1a-0b12-40ee-8f73-cf439eeb8d0d") : secret "samples-operator-tls" not found
Apr 24 14:26:37.165829 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:37.165789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:37.165985 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:37.165924 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:26:37.165985 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:37.165936 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bbcccc89f-jszxr: secret "image-registry-tls" not found
Apr 24 14:26:37.166051 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:37.165987 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls podName:ad2ecded-9941-4d85-be6f-1a36e7e4229a nodeName:}" failed. No retries permitted until 2026-04-24 14:26:39.165972636 +0000 UTC m=+122.423874638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls") pod "image-registry-7bbcccc89f-jszxr" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a") : secret "image-registry-tls" not found
Apr 24 14:26:38.769236 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:38.769192 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" event={"ID":"5bf1887d-7526-48ef-aa33-2cf15cf8ced2","Type":"ContainerStarted","Data":"ba3c31c79ed5c6f054ba7de24bb41bc88cbf82286cc6d2206b6e33db07603cad"}
Apr 24 14:26:38.770511 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:38.770480 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" event={"ID":"03102919-727f-42a2-8035-d799d73184d6","Type":"ContainerStarted","Data":"1f102b4fc6029a7045ed17d1c404a87be414fdc3cf4109a3cfccc6f5852d115e"}
Apr 24 14:26:38.787203 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:38.787152 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" podStartSLOduration=1.461834818 podStartE2EDuration="3.787135334s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:26:35.901229864 +0000 UTC m=+119.159131865" lastFinishedPulling="2026-04-24 14:26:38.226530379 +0000 UTC m=+121.484432381" observedRunningTime="2026-04-24 14:26:38.785670532 +0000 UTC m=+122.043572577" watchObservedRunningTime="2026-04-24 14:26:38.787135334 +0000 UTC m=+122.045037360"
Apr 24 14:26:38.802336 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:38.802282 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" podStartSLOduration=1.493231721 podStartE2EDuration="3.802262937s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:26:35.914758216 +0000 UTC m=+119.172660218" lastFinishedPulling="2026-04-24 14:26:38.22378943 +0000 UTC m=+121.481691434" observedRunningTime="2026-04-24 14:26:38.801002266 +0000 UTC m=+122.058904302" watchObservedRunningTime="2026-04-24 14:26:38.802262937 +0000 UTC m=+122.060164965"
Apr 24 14:26:39.081764 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:39.081652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"
Apr 24 14:26:39.081963 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:39.081775 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:26:39.081963 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:39.081833 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls podName:532e4d1a-0b12-40ee-8f73-cf439eeb8d0d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:43.081818695 +0000 UTC m=+126.339720702 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-24knp" (UID: "532e4d1a-0b12-40ee-8f73-cf439eeb8d0d") : secret "samples-operator-tls" not found
Apr 24 14:26:39.182800 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:39.182756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:39.182990 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:39.182886 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:26:39.182990 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:39.182899 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bbcccc89f-jszxr: secret "image-registry-tls" not found
Apr 24 14:26:39.182990 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:39.182948 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls podName:ad2ecded-9941-4d85-be6f-1a36e7e4229a nodeName:}" failed. No retries permitted until 2026-04-24 14:26:43.182934331 +0000 UTC m=+126.440836332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls") pod "image-registry-7bbcccc89f-jszxr" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a") : secret "image-registry-tls" not found
Apr 24 14:26:39.867800 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:39.867764 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm"]
Apr 24 14:26:39.871137 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:39.871116 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm"
Apr 24 14:26:39.873952 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:39.873931 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 24 14:26:39.874051 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:39.873931 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 24 14:26:39.875110 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:39.875086 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-9xzc7\""
Apr 24 14:26:39.879577 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:39.879515 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm"]
Apr 24 14:26:39.990168 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:39.990129 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg8nc\" (UniqueName: \"kubernetes.io/projected/7b62e7c5-2bcb-415c-ad16-77c9c85fa204-kube-api-access-wg8nc\") pod \"migrator-74bb7799d9-kxnmm\" (UID: \"7b62e7c5-2bcb-415c-ad16-77c9c85fa204\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm"
Apr 24 14:26:40.091078 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:40.091045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wg8nc\" (UniqueName: \"kubernetes.io/projected/7b62e7c5-2bcb-415c-ad16-77c9c85fa204-kube-api-access-wg8nc\") pod \"migrator-74bb7799d9-kxnmm\" (UID: \"7b62e7c5-2bcb-415c-ad16-77c9c85fa204\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm"
Apr 24 14:26:40.103432 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:40.103402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg8nc\" (UniqueName: \"kubernetes.io/projected/7b62e7c5-2bcb-415c-ad16-77c9c85fa204-kube-api-access-wg8nc\") pod \"migrator-74bb7799d9-kxnmm\" (UID: \"7b62e7c5-2bcb-415c-ad16-77c9c85fa204\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm"
Apr 24 14:26:40.182029 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:40.181945 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm"
Apr 24 14:26:40.297960 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:40.297932 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm"]
Apr 24 14:26:40.300805 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:26:40.300775 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b62e7c5_2bcb_415c_ad16_77c9c85fa204.slice/crio-1e892969e8989e50499583291e0819e245e576921a5a9796cf0c1676ad9380f4 WatchSource:0}: Error finding container 1e892969e8989e50499583291e0819e245e576921a5a9796cf0c1676ad9380f4: Status 404 returned error can't find the container with id 1e892969e8989e50499583291e0819e245e576921a5a9796cf0c1676ad9380f4
Apr 24 14:26:40.778418 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:40.778382 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm" event={"ID":"7b62e7c5-2bcb-415c-ad16-77c9c85fa204","Type":"ContainerStarted","Data":"1e892969e8989e50499583291e0819e245e576921a5a9796cf0c1676ad9380f4"}
Apr 24 14:26:41.781680 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:41.781594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm" event={"ID":"7b62e7c5-2bcb-415c-ad16-77c9c85fa204","Type":"ContainerStarted","Data":"606d266534814f64e38de157c03b6f6c38ae892c151bb419a8b4bcce300ea017"}
Apr 24 14:26:41.781680 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:41.781634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm" event={"ID":"7b62e7c5-2bcb-415c-ad16-77c9c85fa204","Type":"ContainerStarted","Data":"ef19df85323aca91c0ac6472099670a47a34f7bc0847a6d9576299b672aa9774"}
Apr 24 14:26:41.797880 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:41.797823 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kxnmm" podStartSLOduration=1.709145293 podStartE2EDuration="2.79780927s" podCreationTimestamp="2026-04-24 14:26:39 +0000 UTC" firstStartedPulling="2026-04-24 14:26:40.303166208 +0000 UTC m=+123.561068209" lastFinishedPulling="2026-04-24 14:26:41.391830181 +0000 UTC m=+124.649732186" observedRunningTime="2026-04-24 14:26:41.796582851 +0000 UTC m=+125.054484876" watchObservedRunningTime="2026-04-24 14:26:41.79780927 +0000 UTC m=+125.055711335"
Apr 24 14:26:43.093523 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:43.093495 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w4bb5_5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8/dns-node-resolver/0.log"
Apr 24 14:26:43.114259 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:43.114228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"
Apr 24 14:26:43.114399 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:43.114341 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:26:43.114399 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:43.114391 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls podName:532e4d1a-0b12-40ee-8f73-cf439eeb8d0d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:51.114378025 +0000 UTC m=+134.372280028 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-24knp" (UID: "532e4d1a-0b12-40ee-8f73-cf439eeb8d0d") : secret "samples-operator-tls" not found
Apr 24 14:26:43.215388 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:43.215344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:43.215589 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:43.215525 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:26:43.215589 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:43.215548 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bbcccc89f-jszxr: secret "image-registry-tls" not found
Apr 24 14:26:43.215717 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:43.215619 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls podName:ad2ecded-9941-4d85-be6f-1a36e7e4229a nodeName:}" failed. No retries permitted until 2026-04-24 14:26:51.215597927 +0000 UTC m=+134.473499929 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls") pod "image-registry-7bbcccc89f-jszxr" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a") : secret "image-registry-tls" not found
Apr 24 14:26:43.890207 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:43.890177 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7nph4_9bac24c0-4d8b-4f25-88b3-6d4cebc649bf/node-ca/0.log"
Apr 24 14:26:45.090820 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:45.090791 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kxnmm_7b62e7c5-2bcb-415c-ad16-77c9c85fa204/migrator/0.log"
Apr 24 14:26:45.290322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:45.290289 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kxnmm_7b62e7c5-2bcb-415c-ad16-77c9c85fa204/graceful-termination/0.log"
Apr 24 14:26:45.491912 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:45.491858 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8fpnr_5bf1887d-7526-48ef-aa33-2cf15cf8ced2/kube-storage-version-migrator-operator/0.log"
Apr 24 14:26:47.044382 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:47.044344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:26:47.044771 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:47.044476 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 14:26:47.044771 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:26:47.044543 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs podName:216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46 nodeName:}" failed. No retries permitted until 2026-04-24 14:28:49.044528095 +0000 UTC m=+252.302430097 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs") pod "network-metrics-daemon-49nxb" (UID: "216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46") : secret "metrics-daemon-secret" not found
Apr 24 14:26:51.178747 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.178692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"
Apr 24 14:26:51.181048 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.181020 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/532e4d1a-0b12-40ee-8f73-cf439eeb8d0d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-24knp\" (UID: \"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"
Apr 24 14:26:51.269727 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.269692 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-z4pdj\""
Apr 24 14:26:51.278014 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.277993 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"
Apr 24 14:26:51.279821 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.279793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:51.282044 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.282019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") pod \"image-registry-7bbcccc89f-jszxr\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") " pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:51.390578 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.390552 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpm4v\""
Apr 24 14:26:51.393187 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.393156 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp"]
Apr 24 14:26:51.398281 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.398256 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:51.516277 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.516176 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bbcccc89f-jszxr"]
Apr 24 14:26:51.518575 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:26:51.518543 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad2ecded_9941_4d85_be6f_1a36e7e4229a.slice/crio-1082a6bb709af022dbf184ca357ba514b482a66aca60323ce50a2d71346d1498 WatchSource:0}: Error finding container 1082a6bb709af022dbf184ca357ba514b482a66aca60323ce50a2d71346d1498: Status 404 returned error can't find the container with id 1082a6bb709af022dbf184ca357ba514b482a66aca60323ce50a2d71346d1498
Apr 24 14:26:51.809253 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.809158 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" event={"ID":"ad2ecded-9941-4d85-be6f-1a36e7e4229a","Type":"ContainerStarted","Data":"9a5c1f9311c7f33872ac73d7786cda10f1ac422ee588053006b1bba52cec357a"}
Apr 24 14:26:51.809253 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.809200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" event={"ID":"ad2ecded-9941-4d85-be6f-1a36e7e4229a","Type":"ContainerStarted","Data":"1082a6bb709af022dbf184ca357ba514b482a66aca60323ce50a2d71346d1498"}
Apr 24 14:26:51.809522 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.809254 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:26:51.810306 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.810283 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" event={"ID":"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d","Type":"ContainerStarted","Data":"f6431a80feb6503014950b14cbd92bcf50f7537b4b337f51f1a8db14b79027a3"}
Apr 24 14:26:51.829654 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:51.829599 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" podStartSLOduration=16.829583957 podStartE2EDuration="16.829583957s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:51.828281905 +0000 UTC m=+135.086183926" watchObservedRunningTime="2026-04-24 14:26:51.829583957 +0000 UTC m=+135.087485980"
Apr 24 14:26:53.817538 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:53.817500 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" event={"ID":"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d","Type":"ContainerStarted","Data":"4b1b7d837d2170229200596c7fc733c29367d8334a79b905b81fe5e6546488bc"}
Apr 24 14:26:53.817538 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:53.817540 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" event={"ID":"532e4d1a-0b12-40ee-8f73-cf439eeb8d0d","Type":"ContainerStarted","Data":"0c11a375ccf781e74b04ea2adb7eb8cbfbc360d7132518941b0ec98789808dbb"}
Apr 24 14:26:53.837533 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:26:53.837483 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-24knp" podStartSLOduration=17.214199255 podStartE2EDuration="18.837468645s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:26:51.462768814 +0000 UTC m=+134.720670816" lastFinishedPulling="2026-04-24 14:26:53.086038193 +0000 UTC
m=+136.343940206" observedRunningTime="2026-04-24 14:26:53.835558123 +0000 UTC m=+137.093460168" watchObservedRunningTime="2026-04-24 14:26:53.837468645 +0000 UTC m=+137.095370668" Apr 24 14:27:02.688812 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.688776 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xkrr7"] Apr 24 14:27:02.694387 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.694364 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.697064 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.697039 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 14:27:02.698297 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.698267 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 14:27:02.698297 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.698296 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 14:27:02.698467 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.698308 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kg8l8\"" Apr 24 14:27:02.698467 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.698300 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 14:27:02.700492 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.700473 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xkrr7"] Apr 24 14:27:02.726639 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.726611 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-7bbcccc89f-jszxr"] Apr 24 14:27:02.763080 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.763049 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b7d4b657c-jctkr"] Apr 24 14:27:02.766633 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.766615 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.768554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.768287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.768554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.768329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqkqw\" (UniqueName: \"kubernetes.io/projected/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-kube-api-access-tqkqw\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.768554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.768364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-crio-socket\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.768554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.768410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.768554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.768481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-data-volume\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.780215 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.780192 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b7d4b657c-jctkr"] Apr 24 14:27:02.869615 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.869588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/349f1bd8-8b03-47cc-8acb-0e815c249834-image-registry-private-configuration\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.869763 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.869626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-data-volume\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.869763 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.869665 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/349f1bd8-8b03-47cc-8acb-0e815c249834-registry-tls\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.869833 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.869777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/349f1bd8-8b03-47cc-8acb-0e815c249834-trusted-ca\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.869833 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.869811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nzr5\" (UniqueName: \"kubernetes.io/projected/349f1bd8-8b03-47cc-8acb-0e815c249834-kube-api-access-7nzr5\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.869925 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.869851 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.869925 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.869909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkqw\" (UniqueName: \"kubernetes.io/projected/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-kube-api-access-tqkqw\") pod \"insights-runtime-extractor-xkrr7\" (UID: 
\"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.870003 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.869940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/349f1bd8-8b03-47cc-8acb-0e815c249834-registry-certificates\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.870003 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.869986 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/349f1bd8-8b03-47cc-8acb-0e815c249834-bound-sa-token\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.870087 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.870017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-data-volume\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.870087 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.870019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/349f1bd8-8b03-47cc-8acb-0e815c249834-installation-pull-secrets\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.870087 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.870072 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-crio-socket\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.870239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.870114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.870239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.870136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/349f1bd8-8b03-47cc-8acb-0e815c249834-ca-trust-extracted\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.870239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.870180 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-crio-socket\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.870370 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.870352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " 
pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.872307 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.872290 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.889523 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.889493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkqw\" (UniqueName: \"kubernetes.io/projected/ff15be00-d8eb-41e8-a1d4-e126f8a91dc6-kube-api-access-tqkqw\") pod \"insights-runtime-extractor-xkrr7\" (UID: \"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6\") " pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:02.974223 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.974128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/349f1bd8-8b03-47cc-8acb-0e815c249834-registry-certificates\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.974223 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.974201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/349f1bd8-8b03-47cc-8acb-0e815c249834-bound-sa-token\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.974413 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.974261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/349f1bd8-8b03-47cc-8acb-0e815c249834-installation-pull-secrets\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.974413 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.974339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/349f1bd8-8b03-47cc-8acb-0e815c249834-ca-trust-extracted\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.974499 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.974379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/349f1bd8-8b03-47cc-8acb-0e815c249834-image-registry-private-configuration\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.975010 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.974972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/349f1bd8-8b03-47cc-8acb-0e815c249834-registry-tls\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.975119 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.975059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/349f1bd8-8b03-47cc-8acb-0e815c249834-trusted-ca\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " 
pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.975119 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.975088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nzr5\" (UniqueName: \"kubernetes.io/projected/349f1bd8-8b03-47cc-8acb-0e815c249834-kube-api-access-7nzr5\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.975296 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.975271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/349f1bd8-8b03-47cc-8acb-0e815c249834-ca-trust-extracted\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.975495 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.975478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/349f1bd8-8b03-47cc-8acb-0e815c249834-registry-certificates\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.975846 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.975823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/349f1bd8-8b03-47cc-8acb-0e815c249834-trusted-ca\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.977533 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.977512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/349f1bd8-8b03-47cc-8acb-0e815c249834-installation-pull-secrets\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.977635 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.977615 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/349f1bd8-8b03-47cc-8acb-0e815c249834-image-registry-private-configuration\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:02.977880 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:02.977849 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/349f1bd8-8b03-47cc-8acb-0e815c249834-registry-tls\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:03.003986 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.003966 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xkrr7" Apr 24 14:27:03.004797 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.004779 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/349f1bd8-8b03-47cc-8acb-0e815c249834-bound-sa-token\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:03.009136 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.009118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nzr5\" (UniqueName: \"kubernetes.io/projected/349f1bd8-8b03-47cc-8acb-0e815c249834-kube-api-access-7nzr5\") pod \"image-registry-6b7d4b657c-jctkr\" (UID: \"349f1bd8-8b03-47cc-8acb-0e815c249834\") " pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:03.075837 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.075732 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:03.146570 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.146532 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xkrr7"] Apr 24 14:27:03.149949 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:27:03.149921 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff15be00_d8eb_41e8_a1d4_e126f8a91dc6.slice/crio-cfca0f90ba2e705bfc6b1ae38450cfeb35bb39cbc6f0823d795e2d1a3b0a5858 WatchSource:0}: Error finding container cfca0f90ba2e705bfc6b1ae38450cfeb35bb39cbc6f0823d795e2d1a3b0a5858: Status 404 returned error can't find the container with id cfca0f90ba2e705bfc6b1ae38450cfeb35bb39cbc6f0823d795e2d1a3b0a5858 Apr 24 14:27:03.206043 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.206010 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b7d4b657c-jctkr"] Apr 24 14:27:03.209254 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:27:03.209230 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod349f1bd8_8b03_47cc_8acb_0e815c249834.slice/crio-577fce12444723c3d32173188d1bc5a818d57983ec36a0107c395d71b73aba0d WatchSource:0}: Error finding container 577fce12444723c3d32173188d1bc5a818d57983ec36a0107c395d71b73aba0d: Status 404 returned error can't find the container with id 577fce12444723c3d32173188d1bc5a818d57983ec36a0107c395d71b73aba0d Apr 24 14:27:03.843164 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.843131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" event={"ID":"349f1bd8-8b03-47cc-8acb-0e815c249834","Type":"ContainerStarted","Data":"2822acb8f91eac51c8f6b469d13a03fe6fe81e6d68effd1387977a9a99ef47cd"} Apr 24 14:27:03.843523 ip-10-0-134-82 
kubenswrapper[2577]: I0424 14:27:03.843176 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" event={"ID":"349f1bd8-8b03-47cc-8acb-0e815c249834","Type":"ContainerStarted","Data":"577fce12444723c3d32173188d1bc5a818d57983ec36a0107c395d71b73aba0d"} Apr 24 14:27:03.843523 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.843219 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" Apr 24 14:27:03.844589 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.844564 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xkrr7" event={"ID":"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6","Type":"ContainerStarted","Data":"6292bba5762e578c5aa2b92a2df5e0afe4deef322db3154ffe4a1fef54514a50"} Apr 24 14:27:03.844667 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.844596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xkrr7" event={"ID":"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6","Type":"ContainerStarted","Data":"8d591ec006780fdf769b22873479d5cbda68d4b52b98a008231ed80755027990"} Apr 24 14:27:03.844667 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.844609 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xkrr7" event={"ID":"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6","Type":"ContainerStarted","Data":"cfca0f90ba2e705bfc6b1ae38450cfeb35bb39cbc6f0823d795e2d1a3b0a5858"} Apr 24 14:27:03.866160 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:03.866123 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" podStartSLOduration=1.866110007 podStartE2EDuration="1.866110007s" podCreationTimestamp="2026-04-24 14:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:27:03.86499734 +0000 UTC m=+147.122899364" watchObservedRunningTime="2026-04-24 14:27:03.866110007 +0000 UTC m=+147.124012008" Apr 24 14:27:05.850517 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:05.850483 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xkrr7" event={"ID":"ff15be00-d8eb-41e8-a1d4-e126f8a91dc6","Type":"ContainerStarted","Data":"329eb2c94b442afe3ea52b6313e9640a0c470d35347e940e17fdc9d373d30977"} Apr 24 14:27:05.870606 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:05.870559 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xkrr7" podStartSLOduration=1.91863563 podStartE2EDuration="3.870547375s" podCreationTimestamp="2026-04-24 14:27:02 +0000 UTC" firstStartedPulling="2026-04-24 14:27:03.205527126 +0000 UTC m=+146.463429143" lastFinishedPulling="2026-04-24 14:27:05.157438878 +0000 UTC m=+148.415340888" observedRunningTime="2026-04-24 14:27:05.86901977 +0000 UTC m=+149.126921810" watchObservedRunningTime="2026-04-24 14:27:05.870547375 +0000 UTC m=+149.128449399" Apr 24 14:27:12.696495 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:27:12.696453 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wp7sh" podUID="a9a9c7a0-840a-4811-89fd-c85ae43af97f" Apr 24 14:27:12.710602 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:27:12.710560 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vsblc" podUID="ce9c2869-28ce-47d7-a2d8-19e09fc982ba" Apr 24 14:27:12.733485 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:12.733448 2577 patch_prober.go:28] interesting 
pod/image-registry-7bbcccc89f-jszxr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 14:27:12.733637 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:12.733529 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" podUID="ad2ecded-9941-4d85-be6f-1a36e7e4229a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:27:12.867190 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:12.867160 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wp7sh" Apr 24 14:27:12.867354 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:12.867160 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vsblc" Apr 24 14:27:13.293535 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.293498 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-26wsd"] Apr 24 14:27:13.298418 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.298397 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.301130 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.301112 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-5dcsr\""
Apr 24 14:27:13.301232 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.301180 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 14:27:13.302488 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.302463 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 14:27:13.302601 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.302504 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 24 14:27:13.302601 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.302585 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 14:27:13.302694 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.302642 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 24 14:27:13.305528 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.305117 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-26wsd"]
Apr 24 14:27:13.359686 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.359652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a033ef24-2d6c-4bf6-9271-934cc94deb41-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.359686 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.359688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a033ef24-2d6c-4bf6-9271-934cc94deb41-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.359952 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.359712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjz5f\" (UniqueName: \"kubernetes.io/projected/a033ef24-2d6c-4bf6-9271-934cc94deb41-kube-api-access-tjz5f\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.359952 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.359849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a033ef24-2d6c-4bf6-9271-934cc94deb41-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.460680 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.460649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a033ef24-2d6c-4bf6-9271-934cc94deb41-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.460680 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.460683 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a033ef24-2d6c-4bf6-9271-934cc94deb41-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.460977 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.460706 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjz5f\" (UniqueName: \"kubernetes.io/projected/a033ef24-2d6c-4bf6-9271-934cc94deb41-kube-api-access-tjz5f\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.460977 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.460916 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a033ef24-2d6c-4bf6-9271-934cc94deb41-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.461496 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.461475 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a033ef24-2d6c-4bf6-9271-934cc94deb41-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.463080 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.463051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a033ef24-2d6c-4bf6-9271-934cc94deb41-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.463194 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.463136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a033ef24-2d6c-4bf6-9271-934cc94deb41-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.469956 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.469934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjz5f\" (UniqueName: \"kubernetes.io/projected/a033ef24-2d6c-4bf6-9271-934cc94deb41-kube-api-access-tjz5f\") pod \"prometheus-operator-5676c8c784-26wsd\" (UID: \"a033ef24-2d6c-4bf6-9271-934cc94deb41\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.608282 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.608248 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd"
Apr 24 14:27:13.743980 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.743946 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-26wsd"]
Apr 24 14:27:13.748222 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:27:13.748183 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda033ef24_2d6c_4bf6_9271_934cc94deb41.slice/crio-3076d0201593d6c43d49e8b9ffd16622b389dee4a147a95dfc5a547b529486a9 WatchSource:0}: Error finding container 3076d0201593d6c43d49e8b9ffd16622b389dee4a147a95dfc5a547b529486a9: Status 404 returned error can't find the container with id 3076d0201593d6c43d49e8b9ffd16622b389dee4a147a95dfc5a547b529486a9
Apr 24 14:27:13.870808 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:13.870728 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd" event={"ID":"a033ef24-2d6c-4bf6-9271-934cc94deb41","Type":"ContainerStarted","Data":"3076d0201593d6c43d49e8b9ffd16622b389dee4a147a95dfc5a547b529486a9"}
Apr 24 14:27:14.344851 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:27:14.344748 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-49nxb" podUID="216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46"
Apr 24 14:27:15.878857 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:15.878819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd" event={"ID":"a033ef24-2d6c-4bf6-9271-934cc94deb41","Type":"ContainerStarted","Data":"0a0dba32d0adcf9b71afd0c426ce0e5af64a87e9951036926c9e4b375235d19d"}
Apr 24 14:27:15.879269 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:15.878861 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd" event={"ID":"a033ef24-2d6c-4bf6-9271-934cc94deb41","Type":"ContainerStarted","Data":"10013e672599059e2970ce79e809c9dba183159d14a2e72b9bb0ee5edd7bf359"}
Apr 24 14:27:15.896767 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:15.896720 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-26wsd" podStartSLOduration=1.823877682 podStartE2EDuration="2.896706658s" podCreationTimestamp="2026-04-24 14:27:13 +0000 UTC" firstStartedPulling="2026-04-24 14:27:13.750023541 +0000 UTC m=+157.007925543" lastFinishedPulling="2026-04-24 14:27:14.82285251 +0000 UTC m=+158.080754519" observedRunningTime="2026-04-24 14:27:15.895056251 +0000 UTC m=+159.152958286" watchObservedRunningTime="2026-04-24 14:27:15.896706658 +0000 UTC m=+159.154608681"
Apr 24 14:27:17.610462 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.610424 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" podUID="045bf3e9-8395-41bc-b778-6a5921eb1095" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.7:8000/readyz\": dial tcp 10.134.0.7:8000: connect: connection refused"
Apr 24 14:27:17.677154 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.677127 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-r6dqm"]
Apr 24 14:27:17.680165 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.680144 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.682859 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.682825 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 14:27:17.682859 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.682829 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6zzcr\""
Apr 24 14:27:17.683002 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.682987 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 14:27:17.683240 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.683223 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 14:27:17.698280 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.698251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh"
Apr 24 14:27:17.698356 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.698307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc"
Apr 24 14:27:17.700535 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.700516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a9c7a0-840a-4811-89fd-c85ae43af97f-metrics-tls\") pod \"dns-default-wp7sh\" (UID: \"a9a9c7a0-840a-4811-89fd-c85ae43af97f\") " pod="openshift-dns/dns-default-wp7sh"
Apr 24 14:27:17.700636 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.700620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9c2869-28ce-47d7-a2d8-19e09fc982ba-cert\") pod \"ingress-canary-vsblc\" (UID: \"ce9c2869-28ce-47d7-a2d8-19e09fc982ba\") " pod="openshift-ingress-canary/ingress-canary-vsblc"
Apr 24 14:27:17.799641 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.799547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-textfile\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.799641 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.799592 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xnpc\" (UniqueName: \"kubernetes.io/projected/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-kube-api-access-5xnpc\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.799641 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.799612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-root\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.799894 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.799687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-wtmp\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.799894 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.799767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-tls\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.799894 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.799825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-accelerators-collector-config\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.799894 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.799846 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-sys\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.800039 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.799901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-metrics-client-ca\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.800039 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.799920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.885846 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.885814 2577 generic.go:358] "Generic (PLEG): container finished" podID="045bf3e9-8395-41bc-b778-6a5921eb1095" containerID="dbae6b6c5428581e79cce37bcc699e2ccedc099ca4ca2ca63eb0c11652f723b0" exitCode=1
Apr 24 14:27:17.886015 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.885896 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" event={"ID":"045bf3e9-8395-41bc-b778-6a5921eb1095","Type":"ContainerDied","Data":"dbae6b6c5428581e79cce37bcc699e2ccedc099ca4ca2ca63eb0c11652f723b0"}
Apr 24 14:27:17.886258 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.886236 2577 scope.go:117] "RemoveContainer" containerID="dbae6b6c5428581e79cce37bcc699e2ccedc099ca4ca2ca63eb0c11652f723b0"
Apr 24 14:27:17.900355 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-tls\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900433 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900387 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-accelerators-collector-config\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900433 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-sys\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900433 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-metrics-client-ca\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900575 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900575 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900485 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-textfile\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900575 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:27:17.900453 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 14:27:17.900575 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-sys\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900575 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xnpc\" (UniqueName: \"kubernetes.io/projected/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-kube-api-access-5xnpc\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900575 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-root\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900881 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-wtmp\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900881 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-root\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900881 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.900751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-wtmp\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.900881 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:27:17.900758 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-tls podName:61fd49c2-35c9-478d-a0db-9c5d5a54b3db nodeName:}" failed. No retries permitted until 2026-04-24 14:27:18.400728313 +0000 UTC m=+161.658630327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-tls") pod "node-exporter-r6dqm" (UID: "61fd49c2-35c9-478d-a0db-9c5d5a54b3db") : secret "node-exporter-tls" not found
Apr 24 14:27:17.901089 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.901076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-metrics-client-ca\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.901134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.901085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-textfile\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.901203 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.901183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-accelerators-collector-config\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.903356 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.903331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.909176 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.909154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xnpc\" (UniqueName: \"kubernetes.io/projected/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-kube-api-access-5xnpc\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:17.970254 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.970218 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bxm2\""
Apr 24 14:27:17.970417 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.970289 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfc9t\""
Apr 24 14:27:17.978589 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.978561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wp7sh"
Apr 24 14:27:17.978736 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:17.978674 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vsblc"
Apr 24 14:27:18.110077 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.109996 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vsblc"]
Apr 24 14:27:18.113467 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:27:18.113434 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce9c2869_28ce_47d7_a2d8_19e09fc982ba.slice/crio-bb2ed91150ce457405eca68bf51449ed18bee3cd5e325d8be019fea2bf286ac4 WatchSource:0}: Error finding container bb2ed91150ce457405eca68bf51449ed18bee3cd5e325d8be019fea2bf286ac4: Status 404 returned error can't find the container with id bb2ed91150ce457405eca68bf51449ed18bee3cd5e325d8be019fea2bf286ac4
Apr 24 14:27:18.122980 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.122957 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wp7sh"]
Apr 24 14:27:18.125001 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:27:18.124976 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9a9c7a0_840a_4811_89fd_c85ae43af97f.slice/crio-54ee8aad3cca15d8aa11b01cfc31caa7946d90f72de996c605d2dcadc3292e79 WatchSource:0}: Error finding container 54ee8aad3cca15d8aa11b01cfc31caa7946d90f72de996c605d2dcadc3292e79: Status 404 returned error can't find the container with id 54ee8aad3cca15d8aa11b01cfc31caa7946d90f72de996c605d2dcadc3292e79
Apr 24 14:27:18.405649 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.405608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-tls\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:18.407840 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.407813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/61fd49c2-35c9-478d-a0db-9c5d5a54b3db-node-exporter-tls\") pod \"node-exporter-r6dqm\" (UID: \"61fd49c2-35c9-478d-a0db-9c5d5a54b3db\") " pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:18.589176 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.589141 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-r6dqm"
Apr 24 14:27:18.601508 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:27:18.601474 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61fd49c2_35c9_478d_a0db_9c5d5a54b3db.slice/crio-9c06b06adddd62dceb551c04a2050b5f89807832e850ced6a0648cdac85a9ec6 WatchSource:0}: Error finding container 9c06b06adddd62dceb551c04a2050b5f89807832e850ced6a0648cdac85a9ec6: Status 404 returned error can't find the container with id 9c06b06adddd62dceb551c04a2050b5f89807832e850ced6a0648cdac85a9ec6
Apr 24 14:27:18.891044 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.890998 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vsblc" event={"ID":"ce9c2869-28ce-47d7-a2d8-19e09fc982ba","Type":"ContainerStarted","Data":"bb2ed91150ce457405eca68bf51449ed18bee3cd5e325d8be019fea2bf286ac4"}
Apr 24 14:27:18.892670 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.892623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r6dqm" event={"ID":"61fd49c2-35c9-478d-a0db-9c5d5a54b3db","Type":"ContainerStarted","Data":"9c06b06adddd62dceb551c04a2050b5f89807832e850ced6a0648cdac85a9ec6"}
Apr 24 14:27:18.895765 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.895735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf" event={"ID":"045bf3e9-8395-41bc-b778-6a5921eb1095","Type":"ContainerStarted","Data":"70a8e5815b932e050ad9e30bcb45dbe1710c4e4a05602dac66550ca50ef9d6ef"}
Apr 24 14:27:18.896566 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.896540 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf"
Apr 24 14:27:18.897095 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.897069 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b768cd975-x56qf"
Apr 24 14:27:18.897564 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:18.897540 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wp7sh" event={"ID":"a9a9c7a0-840a-4811-89fd-c85ae43af97f","Type":"ContainerStarted","Data":"54ee8aad3cca15d8aa11b01cfc31caa7946d90f72de996c605d2dcadc3292e79"}
Apr 24 14:27:20.904684 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:20.904644 2577 generic.go:358] "Generic (PLEG): container finished" podID="61fd49c2-35c9-478d-a0db-9c5d5a54b3db" containerID="e4de214decd4464928e3de0b40778e3493d320ec2ea4a189551aa1ea2344b82e" exitCode=0
Apr 24 14:27:20.905148 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:20.904732 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r6dqm" event={"ID":"61fd49c2-35c9-478d-a0db-9c5d5a54b3db","Type":"ContainerDied","Data":"e4de214decd4464928e3de0b40778e3493d320ec2ea4a189551aa1ea2344b82e"}
Apr 24 14:27:20.906257 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:20.906224 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wp7sh" event={"ID":"a9a9c7a0-840a-4811-89fd-c85ae43af97f","Type":"ContainerStarted","Data":"62632402a49c3c331dd24cf5b48fb4aa16dd2833e7573ad38c6e85bd745317a6"}
Apr 24 14:27:20.906368 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:20.906261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wp7sh" event={"ID":"a9a9c7a0-840a-4811-89fd-c85ae43af97f","Type":"ContainerStarted","Data":"4486e6d46049b292c7073bf9fa678504f18c22550a5c025ec051953e5ca554b3"}
Apr 24 14:27:20.906368 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:20.906350 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wp7sh"
Apr 24 14:27:20.907641 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:20.907618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vsblc" event={"ID":"ce9c2869-28ce-47d7-a2d8-19e09fc982ba","Type":"ContainerStarted","Data":"0f86d4f0ecdd450a7399ba395fa5d7b793dbf1839dcf356d9454e87da2733586"}
Apr 24 14:27:20.947322 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:20.947276 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wp7sh" podStartSLOduration=129.918131935 podStartE2EDuration="2m11.947261418s" podCreationTimestamp="2026-04-24 14:25:09 +0000 UTC" firstStartedPulling="2026-04-24 14:27:18.126806377 +0000 UTC m=+161.384708383" lastFinishedPulling="2026-04-24 14:27:20.155935857 +0000 UTC m=+163.413837866" observedRunningTime="2026-04-24 14:27:20.946155802 +0000 UTC m=+164.204057827" watchObservedRunningTime="2026-04-24 14:27:20.947261418 +0000 UTC m=+164.205163442"
Apr 24 14:27:20.965812 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:20.965759 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vsblc" podStartSLOduration=129.921677881 podStartE2EDuration="2m11.965740768s" podCreationTimestamp="2026-04-24 14:25:09 +0000 UTC" firstStartedPulling="2026-04-24 14:27:18.116185922 +0000 UTC m=+161.374087929" lastFinishedPulling="2026-04-24 14:27:20.16024881 +0000 UTC m=+163.418150816" observedRunningTime="2026-04-24 14:27:20.964146154 +0000 UTC m=+164.222048180" watchObservedRunningTime="2026-04-24 14:27:20.965740768 +0000 UTC m=+164.223642793"
Apr 24 14:27:21.912537 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:21.912500 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r6dqm" event={"ID":"61fd49c2-35c9-478d-a0db-9c5d5a54b3db","Type":"ContainerStarted","Data":"40c97969dff083535af71dcca41f66a073a14adfaa41843bab2338e9736d54fb"}
Apr 24 14:27:21.912537 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:21.912538 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r6dqm" event={"ID":"61fd49c2-35c9-478d-a0db-9c5d5a54b3db","Type":"ContainerStarted","Data":"56b083c0f220c4881c3a795e982fd43a32d88d8475503977a91a4fab7f82d388"}
Apr 24 14:27:21.938268 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:21.938208 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-r6dqm" podStartSLOduration=3.386082526 podStartE2EDuration="4.93819252s" podCreationTimestamp="2026-04-24 14:27:17 +0000 UTC" firstStartedPulling="2026-04-24 14:27:18.603936268 +0000 UTC m=+161.861838271" lastFinishedPulling="2026-04-24 14:27:20.156046262 +0000 UTC m=+163.413948265" observedRunningTime="2026-04-24 14:27:21.936257781 +0000 UTC m=+165.194159806" watchObservedRunningTime="2026-04-24 14:27:21.93819252 +0000 UTC m=+165.196094599"
Apr 24 14:27:22.441909 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.441857 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"]
Apr 24 14:27:22.445340 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.445317 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"
Apr 24 14:27:22.447986 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.447963 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 24 14:27:22.448108 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.447985 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-lmkgb\""
Apr 24 14:27:22.453222 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.453202 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"]
Apr 24 14:27:22.546423 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.546381 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/250c3bae-e7c3-466a-b034-c78f4a29c643-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zhnzw\" (UID: \"250c3bae-e7c3-466a-b034-c78f4a29c643\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"
Apr 24 14:27:22.647634 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.647605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/250c3bae-e7c3-466a-b034-c78f4a29c643-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zhnzw\" (UID: \"250c3bae-e7c3-466a-b034-c78f4a29c643\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"
Apr 24 14:27:22.649990 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.649960 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/250c3bae-e7c3-466a-b034-c78f4a29c643-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zhnzw\" (UID: \"250c3bae-e7c3-466a-b034-c78f4a29c643\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"
Apr 24 14:27:22.730713 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.730644 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:27:22.755943 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.755909 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"
Apr 24 14:27:22.877862 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.875091 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"]
Apr 24 14:27:22.880751 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:27:22.880723 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250c3bae_e7c3_466a_b034_c78f4a29c643.slice/crio-0a67f13b7711f7492ed301aeecef97b7f26e03e4a67a5b21b8548ab0ca4a90e9 WatchSource:0}: Error finding container 0a67f13b7711f7492ed301aeecef97b7f26e03e4a67a5b21b8548ab0ca4a90e9: Status 404 returned error can't find the container with id 0a67f13b7711f7492ed301aeecef97b7f26e03e4a67a5b21b8548ab0ca4a90e9
Apr 24 14:27:22.916284 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:22.916242 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw" event={"ID":"250c3bae-e7c3-466a-b034-c78f4a29c643","Type":"ContainerStarted","Data":"0a67f13b7711f7492ed301aeecef97b7f26e03e4a67a5b21b8548ab0ca4a90e9"}
Apr 24 14:27:23.080593 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.080518 2577 patch_prober.go:28] interesting pod/image-registry-6b7d4b657c-jctkr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check
failed: please see /debug/health"}]} Apr 24 14:27:23.080593 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.080579 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr" podUID="349f1bd8-8b03-47cc-8acb-0e815c249834" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:27:23.899343 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.899248 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:27:23.903851 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.903802 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:23.907591 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.907560 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 14:27:23.907731 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.907586 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 14:27:23.907731 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.907605 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 14:27:23.907850 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.907786 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-l75rv\"" Apr 24 14:27:23.907850 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.907663 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 14:27:23.908187 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.908018 2577 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 14:27:23.908187 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.908084 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 14:27:23.908741 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.908722 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 14:27:23.909121 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.908955 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 14:27:23.909121 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.908958 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 14:27:23.909121 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.909002 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 14:27:23.909336 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.909100 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 14:27:23.909336 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.908961 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-eshbdvbocvpb2\"" Apr 24 14:27:23.909336 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.909104 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 14:27:23.910829 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.910810 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 14:27:23.917953 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:23.917217 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:27:24.063888 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.063834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064078 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.063903 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk8nq\" (UniqueName: \"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-kube-api-access-jk8nq\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064078 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.063970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064078 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.063998 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064066 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064115 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064239 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064235 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064446 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-config\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064446 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-web-config\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064446 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064446 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064446 ip-10-0-134-82 kubenswrapper[2577]: I0424 
14:27:24.064414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064669 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064669 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064669 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-config-out\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.064669 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.064628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166050 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.165974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166050 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-config-out\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166254 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166254 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166254 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jk8nq\" (UniqueName: 
\"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-kube-api-access-jk8nq\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166254 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166480 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166480 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166480 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166480 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166357 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166480 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166480 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166820 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-config\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166820 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-web-config\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166820 ip-10-0-134-82 kubenswrapper[2577]: I0424 
14:27:24.166541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166820 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166820 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.166820 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.167127 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.166908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
14:27:24.168388 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.167963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.168848 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.168820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.169787 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.169763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.169962 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.169937 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-config-out\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.170405 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.170379 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.170738 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.170711 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.170832 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.170800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.171209 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.171186 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.172351 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.172309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.173087 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.173025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.173087 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.173041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.174440 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.174385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.175260 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.174722 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.175260 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.175162 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-web-config\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:24.175260 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.175169 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-config\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:24.175957 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.175937 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:24.177543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.177513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk8nq\" (UniqueName: \"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-kube-api-access-jk8nq\") pod \"prometheus-k8s-0\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:24.220880 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.220827 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:24.363594 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.363560 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 14:27:24.368401 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:27:24.368374 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda45fbccd_7681_49be_bfd1_9f56c8c33268.slice/crio-17571d5977a1ef1a1521a1696111895dee99c64dd0be6e51963c199607f2c6ed WatchSource:0}: Error finding container 17571d5977a1ef1a1521a1696111895dee99c64dd0be6e51963c199607f2c6ed: Status 404 returned error can't find the container with id 17571d5977a1ef1a1521a1696111895dee99c64dd0be6e51963c199607f2c6ed
Apr 24 14:27:24.851361 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.851333 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b7d4b657c-jctkr"
Apr 24 14:27:24.925738 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.925685 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerStarted","Data":"17571d5977a1ef1a1521a1696111895dee99c64dd0be6e51963c199607f2c6ed"}
Apr 24 14:27:24.927157 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.927090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw" event={"ID":"250c3bae-e7c3-466a-b034-c78f4a29c643","Type":"ContainerStarted","Data":"0d0243177267a0acdd25eeb7fd37bb98180e62c5631f4be19b80900aeed1d9fd"}
Apr 24 14:27:24.928321 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.928164 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"
Apr 24 14:27:24.937257 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.937232 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw"
Apr 24 14:27:24.948198 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:24.948145 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zhnzw" podStartSLOduration=1.603719642 podStartE2EDuration="2.948128205s" podCreationTimestamp="2026-04-24 14:27:22 +0000 UTC" firstStartedPulling="2026-04-24 14:27:22.882484516 +0000 UTC m=+166.140386518" lastFinishedPulling="2026-04-24 14:27:24.226893078 +0000 UTC m=+167.484795081" observedRunningTime="2026-04-24 14:27:24.947014359 +0000 UTC m=+168.204916384" watchObservedRunningTime="2026-04-24 14:27:24.948128205 +0000 UTC m=+168.206030231"
Apr 24 14:27:25.931522 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:25.931490 2577 generic.go:358] "Generic (PLEG): container finished" podID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerID="d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898" exitCode=0
Apr 24 14:27:25.931899 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:25.931573 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerDied","Data":"d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898"}
Apr 24 14:27:27.746356 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:27.746311 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" podUID="ad2ecded-9941-4d85-be6f-1a36e7e4229a" containerName="registry" containerID="cri-o://9a5c1f9311c7f33872ac73d7786cda10f1ac422ee588053006b1bba52cec357a" gracePeriod=30
Apr 24 14:27:27.940020 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:27.939988 2577 generic.go:358] "Generic (PLEG): container finished" podID="ad2ecded-9941-4d85-be6f-1a36e7e4229a" containerID="9a5c1f9311c7f33872ac73d7786cda10f1ac422ee588053006b1bba52cec357a" exitCode=0
Apr 24 14:27:27.940152 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:27.940059 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" event={"ID":"ad2ecded-9941-4d85-be6f-1a36e7e4229a","Type":"ContainerDied","Data":"9a5c1f9311c7f33872ac73d7786cda10f1ac422ee588053006b1bba52cec357a"}
Apr 24 14:27:28.326559 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:28.326514 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:27:28.908020 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:28.907995 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:27:28.945279 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:28.945235 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr" event={"ID":"ad2ecded-9941-4d85-be6f-1a36e7e4229a","Type":"ContainerDied","Data":"1082a6bb709af022dbf184ca357ba514b482a66aca60323ce50a2d71346d1498"}
Apr 24 14:27:28.945279 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:28.945277 2577 scope.go:117] "RemoveContainer" containerID="9a5c1f9311c7f33872ac73d7786cda10f1ac422ee588053006b1bba52cec357a"
Apr 24 14:27:28.945520 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:28.945313 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bbcccc89f-jszxr"
Apr 24 14:27:29.016280 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.016241 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z58dj\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-kube-api-access-z58dj\") pod \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") "
Apr 24 14:27:29.016435 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.016296 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") pod \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") "
Apr 24 14:27:29.016435 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.016325 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-image-registry-private-configuration\") pod \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") "
Apr 24 14:27:29.016435 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.016341 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-bound-sa-token\") pod \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") "
Apr 24 14:27:29.016435 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.016369 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-trusted-ca\") pod \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") "
Apr 24 14:27:29.016435 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.016386 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-certificates\") pod \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") "
Apr 24 14:27:29.016435 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.016414 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-installation-pull-secrets\") pod \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") "
Apr 24 14:27:29.016709 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.016444 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad2ecded-9941-4d85-be6f-1a36e7e4229a-ca-trust-extracted\") pod \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\" (UID: \"ad2ecded-9941-4d85-be6f-1a36e7e4229a\") "
Apr 24 14:27:29.017129 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.017088 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ad2ecded-9941-4d85-be6f-1a36e7e4229a" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:29.017646 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.017587 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ad2ecded-9941-4d85-be6f-1a36e7e4229a" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:29.019550 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.019520 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ad2ecded-9941-4d85-be6f-1a36e7e4229a" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:29.019645 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.019546 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ad2ecded-9941-4d85-be6f-1a36e7e4229a" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:29.019818 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.019795 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ad2ecded-9941-4d85-be6f-1a36e7e4229a" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:29.020062 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.020042 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-kube-api-access-z58dj" (OuterVolumeSpecName: "kube-api-access-z58dj") pod "ad2ecded-9941-4d85-be6f-1a36e7e4229a" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a"). InnerVolumeSpecName "kube-api-access-z58dj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:29.020201 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.020188 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ad2ecded-9941-4d85-be6f-1a36e7e4229a" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:29.026280 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.026256 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2ecded-9941-4d85-be6f-1a36e7e4229a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ad2ecded-9941-4d85-be6f-1a36e7e4229a" (UID: "ad2ecded-9941-4d85-be6f-1a36e7e4229a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:27:29.117328 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.117193 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-trusted-ca\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:27:29.117328 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.117226 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-certificates\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:27:29.117328 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.117241 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-installation-pull-secrets\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:27:29.117328 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.117255 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad2ecded-9941-4d85-be6f-1a36e7e4229a-ca-trust-extracted\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:27:29.117328 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.117269 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z58dj\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-kube-api-access-z58dj\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:27:29.117328 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.117282 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-registry-tls\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:27:29.117328 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.117294 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad2ecded-9941-4d85-be6f-1a36e7e4229a-image-registry-private-configuration\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:27:29.117328 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.117308 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad2ecded-9941-4d85-be6f-1a36e7e4229a-bound-sa-token\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:27:29.271922 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.271889 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bbcccc89f-jszxr"]
Apr 24 14:27:29.278137 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.278110 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7bbcccc89f-jszxr"]
Apr 24 14:27:29.329753 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.329721 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2ecded-9941-4d85-be6f-1a36e7e4229a" path="/var/lib/kubelet/pods/ad2ecded-9941-4d85-be6f-1a36e7e4229a/volumes"
Apr 24 14:27:29.950947 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.950911 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerStarted","Data":"3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba"}
Apr 24 14:27:29.951427 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:29.950955 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerStarted","Data":"67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d"}
Apr 24 14:27:30.915368 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:30.915341 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wp7sh"
Apr 24 14:27:30.958705 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:30.958567 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerStarted","Data":"2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850"}
Apr 24 14:27:31.963884 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:31.963826 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerStarted","Data":"10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e"}
Apr 24 14:27:31.963884 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:31.963880 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerStarted","Data":"bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29"}
Apr 24 14:27:31.964280 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:31.963895 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerStarted","Data":"8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734"}
Apr 24 14:27:31.999762 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:31.999711 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.538581021 podStartE2EDuration="8.999697114s" podCreationTimestamp="2026-04-24 14:27:23 +0000 UTC" firstStartedPulling="2026-04-24 14:27:24.372379756 +0000 UTC m=+167.630281757" lastFinishedPulling="2026-04-24 14:27:30.833495848 +0000 UTC m=+174.091397850" observedRunningTime="2026-04-24 14:27:31.997398457 +0000 UTC m=+175.255300505" watchObservedRunningTime="2026-04-24 14:27:31.999697114 +0000 UTC m=+175.257599138"
Apr 24 14:27:34.221716 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:34.221664 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:50.015380 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:50.015342 2577 generic.go:358] "Generic (PLEG): container finished" podID="03102919-727f-42a2-8035-d799d73184d6" containerID="1f102b4fc6029a7045ed17d1c404a87be414fdc3cf4109a3cfccc6f5852d115e" exitCode=0
Apr 24 14:27:50.015802 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:50.015418 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" event={"ID":"03102919-727f-42a2-8035-d799d73184d6","Type":"ContainerDied","Data":"1f102b4fc6029a7045ed17d1c404a87be414fdc3cf4109a3cfccc6f5852d115e"}
Apr 24 14:27:50.015802 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:50.015717 2577 scope.go:117] "RemoveContainer" containerID="1f102b4fc6029a7045ed17d1c404a87be414fdc3cf4109a3cfccc6f5852d115e"
Apr 24 14:27:51.019595 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:27:51.019558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vzfcf" event={"ID":"03102919-727f-42a2-8035-d799d73184d6","Type":"ContainerStarted","Data":"ee22b741a9813830291402312985f9b13c95a5f90cfbd600733b636078396ceb"}
Apr 24 14:28:09.070537 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:09.070502 2577 generic.go:358] "Generic (PLEG): container finished" podID="5bf1887d-7526-48ef-aa33-2cf15cf8ced2" containerID="ba3c31c79ed5c6f054ba7de24bb41bc88cbf82286cc6d2206b6e33db07603cad" exitCode=0
Apr 24 14:28:09.070928 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:09.070562 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" event={"ID":"5bf1887d-7526-48ef-aa33-2cf15cf8ced2","Type":"ContainerDied","Data":"ba3c31c79ed5c6f054ba7de24bb41bc88cbf82286cc6d2206b6e33db07603cad"}
Apr 24 14:28:09.070928 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:09.070846 2577 scope.go:117] "RemoveContainer" containerID="ba3c31c79ed5c6f054ba7de24bb41bc88cbf82286cc6d2206b6e33db07603cad"
Apr 24 14:28:10.074492 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:10.074455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8fpnr" event={"ID":"5bf1887d-7526-48ef-aa33-2cf15cf8ced2","Type":"ContainerStarted","Data":"270b0c607126a263ffd5effdab1cf3189bd855fbbd20dae2ef9c3d82b35eafe7"}
Apr 24 14:28:24.221679 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:24.221636 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:24.241574 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:24.241544 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:25.132972 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:25.132945 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:42.298566 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:42.298529 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 14:28:42.299192 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:42.299163 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy" containerID="cri-o://8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734" gracePeriod=600
Apr 24 14:28:42.299417 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:42.299151 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="prometheus" containerID="cri-o://67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d" gracePeriod=600
Apr 24 14:28:42.299544 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:42.299200 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy-thanos" containerID="cri-o://bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29" gracePeriod=600
Apr 24 14:28:42.299544 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:42.299205 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy-web" containerID="cri-o://10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e" gracePeriod=600
Apr 24 14:28:42.299544 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:42.299209 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="config-reloader" containerID="cri-o://3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba" gracePeriod=600
Apr 24 14:28:42.299707 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:42.299242 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="thanos-sidecar" containerID="cri-o://2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850" gracePeriod=600
Apr 24 14:28:43.176683 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176647 2577 generic.go:358] "Generic (PLEG): container finished" podID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerID="bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29" exitCode=0
Apr 24 14:28:43.176683 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176672 2577 generic.go:358] "Generic (PLEG): container finished" podID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerID="8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734" exitCode=0
Apr 24 14:28:43.176683 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176680 2577 generic.go:358] "Generic (PLEG): container finished" podID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerID="2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850" exitCode=0
Apr 24 14:28:43.176683 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176686 2577 generic.go:358] "Generic (PLEG): container finished" podID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerID="3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba" exitCode=0
Apr 24 14:28:43.176683 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176692 2577 generic.go:358] "Generic (PLEG): container finished" podID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerID="67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d" exitCode=0
Apr 24 14:28:43.177007 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176719 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerDied","Data":"bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29"}
Apr 24 14:28:43.177007 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176753 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerDied","Data":"8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734"}
Apr 24 14:28:43.177007 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176763 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerDied","Data":"2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850"}
Apr 24 14:28:43.177007 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176772 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerDied","Data":"3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba"}
Apr 24 14:28:43.177007 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.176781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerDied","Data":"67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d"}
Apr 24 14:28:43.542277 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.542257 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:43.631315 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631284 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-thanos-prometheus-http-client-file\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631315 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631323 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-tls-assets\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631353 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-db\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631389 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-kube-rbac-proxy\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631422 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-config\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631449 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-web-config\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631481 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631511 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-grpc-tls\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631554 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631541 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-kubelet-serving-ca-bundle\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631568 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631603 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-config-out\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631647 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk8nq\" (UniqueName: \"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-kube-api-access-jk8nq\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631671 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-metrics-client-certs\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631697 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-trusted-ca-bundle\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631722 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-metrics-client-ca\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631759 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-tls\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631784 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-serving-certs-ca-bundle\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.631942 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.631824 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-rulefiles-0\") pod \"a45fbccd-7681-49be-bfd1-9f56c8c33268\" (UID: \"a45fbccd-7681-49be-bfd1-9f56c8c33268\") "
Apr 24 14:28:43.632343 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.632224 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:28:43.633072 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.633035 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:28:43.633540 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.633147 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:28:43.633540 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.633405 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:28:43.633540 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.633474 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:28:43.634345 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.634318 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:28:43.634450 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.634381 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:28:43.634665 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.634638 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-config-out" (OuterVolumeSpecName: "config-out") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:28:43.635353 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.635314 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "secret-kube-rbac-proxy".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:43.636024 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.636002 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-config" (OuterVolumeSpecName: "config") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:43.636565 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.636525 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:43.636565 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.636556 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:28:43.636710 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.636623 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-kube-api-access-jk8nq" (OuterVolumeSpecName: "kube-api-access-jk8nq") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "kube-api-access-jk8nq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:28:43.636710 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.636655 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:43.636814 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.636784 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:43.637068 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.637050 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:43.637759 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.637727 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:43.645126 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.645102 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-web-config" (OuterVolumeSpecName: "web-config") pod "a45fbccd-7681-49be-bfd1-9f56c8c33268" (UID: "a45fbccd-7681-49be-bfd1-9f56c8c33268"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:43.732812 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732731 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.732812 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732770 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.732812 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732781 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.732812 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732791 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.732812 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732801 2577 reconciler_common.go:299] "Volume detached for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-tls-assets\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.732812 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732810 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-k8s-db\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.732812 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732819 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-kube-rbac-proxy\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732828 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-config\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732836 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-web-config\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732845 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732853 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-grpc-tls\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732862 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732909 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732922 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a45fbccd-7681-49be-bfd1-9f56c8c33268-config-out\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732933 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jk8nq\" (UniqueName: \"kubernetes.io/projected/a45fbccd-7681-49be-bfd1-9f56c8c33268-kube-api-access-jk8nq\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732942 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a45fbccd-7681-49be-bfd1-9f56c8c33268-secret-metrics-client-certs\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732951 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:43.733134 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:43.732959 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a45fbccd-7681-49be-bfd1-9f56c8c33268-configmap-metrics-client-ca\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:28:44.184709 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.184672 2577 generic.go:358] "Generic (PLEG): container finished" podID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerID="10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e" exitCode=0 Apr 24 14:28:44.184898 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.184735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerDied","Data":"10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e"} Apr 24 14:28:44.184898 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.184771 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a45fbccd-7681-49be-bfd1-9f56c8c33268","Type":"ContainerDied","Data":"17571d5977a1ef1a1521a1696111895dee99c64dd0be6e51963c199607f2c6ed"} Apr 24 14:28:44.184898 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.184784 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:28:44.185057 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.184788 2577 scope.go:117] "RemoveContainer" containerID="bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29" Apr 24 14:28:44.193682 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.193557 2577 scope.go:117] "RemoveContainer" containerID="8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734" Apr 24 14:28:44.200372 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.200351 2577 scope.go:117] "RemoveContainer" containerID="10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e" Apr 24 14:28:44.206555 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.206537 2577 scope.go:117] "RemoveContainer" containerID="2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850" Apr 24 14:28:44.211217 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.211195 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:28:44.212818 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.212801 2577 scope.go:117] "RemoveContainer" containerID="3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba" Apr 24 14:28:44.217909 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.217888 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:28:44.219170 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.219146 2577 scope.go:117] "RemoveContainer" containerID="67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d" Apr 24 14:28:44.225455 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.225425 2577 scope.go:117] "RemoveContainer" containerID="d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898" Apr 24 14:28:44.231295 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.231279 2577 scope.go:117] "RemoveContainer" 
containerID="bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29" Apr 24 14:28:44.231528 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:28:44.231511 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29\": container with ID starting with bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29 not found: ID does not exist" containerID="bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29" Apr 24 14:28:44.231586 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.231535 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29"} err="failed to get container status \"bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29\": rpc error: code = NotFound desc = could not find container \"bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29\": container with ID starting with bbdfda24362bde343b0e4818d8398839365f7b5f466d2b6e380d0fe0fa2dbf29 not found: ID does not exist" Apr 24 14:28:44.231586 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.231564 2577 scope.go:117] "RemoveContainer" containerID="8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734" Apr 24 14:28:44.231766 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:28:44.231750 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734\": container with ID starting with 8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734 not found: ID does not exist" containerID="8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734" Apr 24 14:28:44.231808 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.231770 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734"} err="failed to get container status \"8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734\": rpc error: code = NotFound desc = could not find container \"8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734\": container with ID starting with 8224ee355c7602e176bd6ca0107a444463c299f7856f2bc71e5c766e046b7734 not found: ID does not exist" Apr 24 14:28:44.231808 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.231783 2577 scope.go:117] "RemoveContainer" containerID="10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e" Apr 24 14:28:44.231982 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:28:44.231965 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e\": container with ID starting with 10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e not found: ID does not exist" containerID="10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e" Apr 24 14:28:44.232026 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.231985 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e"} err="failed to get container status \"10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e\": rpc error: code = NotFound desc = could not find container \"10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e\": container with ID starting with 10ceae241a8a83d3fced7c741fa19b5ecd00e1bd9ad0cb86f4e98959d0ee4b5e not found: ID does not exist" Apr 24 14:28:44.232026 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.231998 2577 scope.go:117] "RemoveContainer" containerID="2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850" Apr 24 14:28:44.232205 ip-10-0-134-82 
kubenswrapper[2577]: E0424 14:28:44.232191 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850\": container with ID starting with 2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850 not found: ID does not exist" containerID="2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850" Apr 24 14:28:44.232246 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.232209 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850"} err="failed to get container status \"2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850\": rpc error: code = NotFound desc = could not find container \"2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850\": container with ID starting with 2b0f36f5ed976fac28a578ffb4e3b2fd2741d2a4f01d54b5dc5eaea14e838850 not found: ID does not exist" Apr 24 14:28:44.232246 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.232221 2577 scope.go:117] "RemoveContainer" containerID="3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba" Apr 24 14:28:44.232399 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:28:44.232381 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba\": container with ID starting with 3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba not found: ID does not exist" containerID="3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba" Apr 24 14:28:44.232442 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.232401 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba"} err="failed to 
get container status \"3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba\": rpc error: code = NotFound desc = could not find container \"3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba\": container with ID starting with 3927280e21fcdb7c0820ffcf674b939d63d3ec8785f15b660e7ae32efc7e17ba not found: ID does not exist" Apr 24 14:28:44.232442 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.232414 2577 scope.go:117] "RemoveContainer" containerID="67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d" Apr 24 14:28:44.232562 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:28:44.232547 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d\": container with ID starting with 67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d not found: ID does not exist" containerID="67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d" Apr 24 14:28:44.232611 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.232563 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d"} err="failed to get container status \"67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d\": rpc error: code = NotFound desc = could not find container \"67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d\": container with ID starting with 67d73c42e459b2d331e31447c078bf3d5b509d3c061958eba4ba4783ce9ff97d not found: ID does not exist" Apr 24 14:28:44.232611 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.232573 2577 scope.go:117] "RemoveContainer" containerID="d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898" Apr 24 14:28:44.232721 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:28:44.232706 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898\": container with ID starting with d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898 not found: ID does not exist" containerID="d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898" Apr 24 14:28:44.232760 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.232723 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898"} err="failed to get container status \"d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898\": rpc error: code = NotFound desc = could not find container \"d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898\": container with ID starting with d12ed9e951586b68f6a13f93255ff81afd9bfc2026cd3eb5810645e7831e5898 not found: ID does not exist" Apr 24 14:28:44.249862 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.249840 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:28:44.250114 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250101 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad2ecded-9941-4d85-be6f-1a36e7e4229a" containerName="registry" Apr 24 14:28:44.250159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250116 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2ecded-9941-4d85-be6f-1a36e7e4229a" containerName="registry" Apr 24 14:28:44.250159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250126 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy" Apr 24 14:28:44.250159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250132 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy" Apr 24 
14:28:44.250159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250143 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy-web" Apr 24 14:28:44.250159 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250151 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy-web" Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250167 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="config-reloader" Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250174 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="config-reloader" Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250179 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="thanos-sidecar" Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250184 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="thanos-sidecar" Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250191 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="init-config-reloader" Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250196 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="init-config-reloader" Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250205 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="prometheus"
Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250210 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="prometheus"
Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250219 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy-thanos"
Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250227 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy-thanos"
Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250267 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy-web"
Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250276 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="config-reloader"
Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250282 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy"
Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250297 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="thanos-sidecar"
Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250307 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad2ecded-9941-4d85-be6f-1a36e7e4229a" containerName="registry"
Apr 24 14:28:44.250310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250316 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="prometheus"
Apr 24 14:28:44.250754 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.250322 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" containerName="kube-rbac-proxy-thanos"
Apr 24 14:28:44.255271 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.255255 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258213 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258263 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258309 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-l75rv\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258324 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258311 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-eshbdvbocvpb2\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258309 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258407 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258434 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258455 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258477 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 14:28:44.258543 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.258436 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 14:28:44.259327 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.259290 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 14:28:44.259406 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.259352 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 14:28:44.261530 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.261511 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 14:28:44.264126 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.264106 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 14:28:44.270275 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.270255 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 14:28:44.336495 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336467 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7eaa547e-cb49-4949-809e-8725c2e3ce11-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336495 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336676 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336520 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336676 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336676 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7eaa547e-cb49-4949-809e-8725c2e3ce11-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336676 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336676 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336639 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336676 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvg6p\" (UniqueName: \"kubernetes.io/projected/7eaa547e-cb49-4949-809e-8725c2e3ce11-kube-api-access-pvg6p\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336676 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336674 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-web-config\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336766 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7eaa547e-cb49-4949-809e-8725c2e3ce11-config-out\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-config\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.336917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.336882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.438061 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.437983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.438061 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.438226 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7eaa547e-cb49-4949-809e-8725c2e3ce11-config-out\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.438261 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-config\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.438308 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438267 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.438350 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.438350 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.438440 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.438440 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439131 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439131 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439131 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7eaa547e-cb49-4949-809e-8725c2e3ce11-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439131 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.438986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439131 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.439040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439131 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.439081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439131 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.439113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7eaa547e-cb49-4949-809e-8725c2e3ce11-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439131 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.439138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439751 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.439171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439751 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.439209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvg6p\" (UniqueName: \"kubernetes.io/projected/7eaa547e-cb49-4949-809e-8725c2e3ce11-kube-api-access-pvg6p\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439751 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.439239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-web-config\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.439751 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.439467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.441734 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.441417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-config\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.441734 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.441417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.441734 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.441681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.441989 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.441970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7eaa547e-cb49-4949-809e-8725c2e3ce11-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.442045 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.442013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-web-config\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.442147 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.442063 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.442147 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.442084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.442360 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.442334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7eaa547e-cb49-4949-809e-8725c2e3ce11-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.442762 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.442740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7eaa547e-cb49-4949-809e-8725c2e3ce11-config-out\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.443001 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.442982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.444377 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.444348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7eaa547e-cb49-4949-809e-8725c2e3ce11-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.444461 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.444418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.444561 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.444539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.444662 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.444645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7eaa547e-cb49-4949-809e-8725c2e3ce11-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.452465 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.452445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvg6p\" (UniqueName: \"kubernetes.io/projected/7eaa547e-cb49-4949-809e-8725c2e3ce11-kube-api-access-pvg6p\") pod \"prometheus-k8s-0\" (UID: \"7eaa547e-cb49-4949-809e-8725c2e3ce11\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.565330 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.565280 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:44.694487 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:44.694465 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 14:28:44.696897 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:28:44.696852 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eaa547e_cb49_4949_809e_8725c2e3ce11.slice/crio-d389681804e4016186d3abfcd98c144a15673b9f9ca46f62456359e0efa565f8 WatchSource:0}: Error finding container d389681804e4016186d3abfcd98c144a15673b9f9ca46f62456359e0efa565f8: Status 404 returned error can't find the container with id d389681804e4016186d3abfcd98c144a15673b9f9ca46f62456359e0efa565f8
Apr 24 14:28:45.188827 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:45.188793 2577 generic.go:358] "Generic (PLEG): container finished" podID="7eaa547e-cb49-4949-809e-8725c2e3ce11" containerID="8cea9b55135a20e5bd1d38cf08ff874e486a8d38890da51b89731cea257e404d" exitCode=0
Apr 24 14:28:45.189011 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:45.188844 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7eaa547e-cb49-4949-809e-8725c2e3ce11","Type":"ContainerDied","Data":"8cea9b55135a20e5bd1d38cf08ff874e486a8d38890da51b89731cea257e404d"}
Apr 24 14:28:45.189011 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:45.188893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7eaa547e-cb49-4949-809e-8725c2e3ce11","Type":"ContainerStarted","Data":"d389681804e4016186d3abfcd98c144a15673b9f9ca46f62456359e0efa565f8"}
Apr 24 14:28:45.331137 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:45.331097 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45fbccd-7681-49be-bfd1-9f56c8c33268" path="/var/lib/kubelet/pods/a45fbccd-7681-49be-bfd1-9f56c8c33268/volumes"
Apr 24 14:28:46.194806 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:46.194771 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7eaa547e-cb49-4949-809e-8725c2e3ce11","Type":"ContainerStarted","Data":"f73d790af8330f7a765e7847f2aeac3fdd98fb399bac95d51118f42c2d16533e"}
Apr 24 14:28:46.194806 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:46.194805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7eaa547e-cb49-4949-809e-8725c2e3ce11","Type":"ContainerStarted","Data":"35012dd012e17af36c698d16306388101900be06aee4e4167cea25cce1ddfd08"}
Apr 24 14:28:46.194806 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:46.194814 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7eaa547e-cb49-4949-809e-8725c2e3ce11","Type":"ContainerStarted","Data":"51c7aa32e86c84b7a38e098237b648a7da6174fb99a3e064055efa1f6b7e9eac"}
Apr 24 14:28:46.195258 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:46.194823 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7eaa547e-cb49-4949-809e-8725c2e3ce11","Type":"ContainerStarted","Data":"49c3c4902730ed9b8fea96b6f322bb559462a66c7a97f8947ef28f76a48cb32a"}
Apr 24 14:28:46.195258 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:46.194831 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7eaa547e-cb49-4949-809e-8725c2e3ce11","Type":"ContainerStarted","Data":"544d11d9388bc25c77581e90fdbbdb8ada180c5a8da46bd1496f63d77e64d324"}
Apr 24 14:28:46.195258 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:46.194841 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7eaa547e-cb49-4949-809e-8725c2e3ce11","Type":"ContainerStarted","Data":"66e7e10ce2a089f368ef82cd06eb74a2b8fa9977d81b236145e93dae75b370ef"}
Apr 24 14:28:46.225405 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:46.225349 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.225334161 podStartE2EDuration="2.225334161s" podCreationTimestamp="2026-04-24 14:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:28:46.22322926 +0000 UTC m=+249.481131284" watchObservedRunningTime="2026-04-24 14:28:46.225334161 +0000 UTC m=+249.483236185"
Apr 24 14:28:49.076014 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:49.075974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:28:49.078281 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:49.078257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46-metrics-certs\") pod \"network-metrics-daemon-49nxb\" (UID: \"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46\") " pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:28:49.331660 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:49.331576 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-crqvq\""
Apr 24 14:28:49.337776 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:49.337751 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-49nxb"
Apr 24 14:28:49.454152 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:49.454120 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-49nxb"]
Apr 24 14:28:49.456761 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:28:49.456728 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod216f65fd_4d9d_4b6a_a1e1_a8bbfe802a46.slice/crio-f21616385786b2ad0c4a43fce315e0b7c5ff7093fa713b5d23c1a1208af17ed5 WatchSource:0}: Error finding container f21616385786b2ad0c4a43fce315e0b7c5ff7093fa713b5d23c1a1208af17ed5: Status 404 returned error can't find the container with id f21616385786b2ad0c4a43fce315e0b7c5ff7093fa713b5d23c1a1208af17ed5
Apr 24 14:28:49.565795 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:49.565761 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:50.207950 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:50.207905 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-49nxb" event={"ID":"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46","Type":"ContainerStarted","Data":"f21616385786b2ad0c4a43fce315e0b7c5ff7093fa713b5d23c1a1208af17ed5"}
Apr 24 14:28:51.213744 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:51.213702 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-49nxb" event={"ID":"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46","Type":"ContainerStarted","Data":"03e11d105bfa34b38c961838abd5ee7b632b327cf7d4150325e87f84feb64e07"}
Apr 24 14:28:51.213744 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:51.213742 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-49nxb" event={"ID":"216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46","Type":"ContainerStarted","Data":"6020f8ebb7f44e04d6f1c3ddba901c3ee12f0e3de691445182f2042bcfe67b8e"}
Apr 24 14:28:51.231747 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:28:51.231695 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-49nxb" podStartSLOduration=253.293781257 podStartE2EDuration="4m14.231678982s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:28:49.458714233 +0000 UTC m=+252.716616235" lastFinishedPulling="2026-04-24 14:28:50.396611959 +0000 UTC m=+253.654513960" observedRunningTime="2026-04-24 14:28:51.229893887 +0000 UTC m=+254.487795911" watchObservedRunningTime="2026-04-24 14:28:51.231678982 +0000 UTC m=+254.489581005"
Apr 24 14:29:37.225280 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:29:37.225258 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log"
Apr 24 14:29:37.225654 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:29:37.225261 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log"
Apr 24 14:29:37.228310 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:29:37.228292 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 14:29:44.566047 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:29:44.566009 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:29:44.580996 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:29:44.580970 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:29:45.372018 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:29:45.371989 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:30:50.344889 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.344831 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mn8sp"]
Apr 24 14:30:50.348226 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.348202 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mn8sp"
Apr 24 14:30:50.350818 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.350799 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 14:30:50.355702 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.355681 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mn8sp"]
Apr 24 14:30:50.494753 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.494718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4128e61-3f80-4f61-b6bc-6602a75f282a-original-pull-secret\") pod \"global-pull-secret-syncer-mn8sp\" (UID: \"a4128e61-3f80-4f61-b6bc-6602a75f282a\") " pod="kube-system/global-pull-secret-syncer-mn8sp"
Apr 24 14:30:50.494930 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.494783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a4128e61-3f80-4f61-b6bc-6602a75f282a-dbus\") pod \"global-pull-secret-syncer-mn8sp\" (UID: \"a4128e61-3f80-4f61-b6bc-6602a75f282a\") " pod="kube-system/global-pull-secret-syncer-mn8sp"
Apr 24 14:30:50.494930 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.494828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a4128e61-3f80-4f61-b6bc-6602a75f282a-kubelet-config\") pod \"global-pull-secret-syncer-mn8sp\" (UID: \"a4128e61-3f80-4f61-b6bc-6602a75f282a\") " pod="kube-system/global-pull-secret-syncer-mn8sp"
Apr 24 14:30:50.595553 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.595509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a4128e61-3f80-4f61-b6bc-6602a75f282a-dbus\") pod \"global-pull-secret-syncer-mn8sp\" (UID: \"a4128e61-3f80-4f61-b6bc-6602a75f282a\") " pod="kube-system/global-pull-secret-syncer-mn8sp"
Apr 24 14:30:50.595747 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.595579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a4128e61-3f80-4f61-b6bc-6602a75f282a-kubelet-config\") pod \"global-pull-secret-syncer-mn8sp\" (UID: \"a4128e61-3f80-4f61-b6bc-6602a75f282a\") " pod="kube-system/global-pull-secret-syncer-mn8sp"
Apr 24 14:30:50.595747 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.595606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4128e61-3f80-4f61-b6bc-6602a75f282a-original-pull-secret\") pod \"global-pull-secret-syncer-mn8sp\" (UID: \"a4128e61-3f80-4f61-b6bc-6602a75f282a\") " pod="kube-system/global-pull-secret-syncer-mn8sp"
Apr 24 14:30:50.595747 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.595704 2577 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a4128e61-3f80-4f61-b6bc-6602a75f282a-dbus\") pod \"global-pull-secret-syncer-mn8sp\" (UID: \"a4128e61-3f80-4f61-b6bc-6602a75f282a\") " pod="kube-system/global-pull-secret-syncer-mn8sp" Apr 24 14:30:50.595747 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.595722 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a4128e61-3f80-4f61-b6bc-6602a75f282a-kubelet-config\") pod \"global-pull-secret-syncer-mn8sp\" (UID: \"a4128e61-3f80-4f61-b6bc-6602a75f282a\") " pod="kube-system/global-pull-secret-syncer-mn8sp" Apr 24 14:30:50.597814 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.597797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4128e61-3f80-4f61-b6bc-6602a75f282a-original-pull-secret\") pod \"global-pull-secret-syncer-mn8sp\" (UID: \"a4128e61-3f80-4f61-b6bc-6602a75f282a\") " pod="kube-system/global-pull-secret-syncer-mn8sp" Apr 24 14:30:50.658248 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.658212 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mn8sp" Apr 24 14:30:50.774220 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.774193 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mn8sp"] Apr 24 14:30:50.776686 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:30:50.776661 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4128e61_3f80_4f61_b6bc_6602a75f282a.slice/crio-9dd6a41a4b9a2ba166433a6f94d731ac11be69d7da6c633f30d8f909aec0b6a9 WatchSource:0}: Error finding container 9dd6a41a4b9a2ba166433a6f94d731ac11be69d7da6c633f30d8f909aec0b6a9: Status 404 returned error can't find the container with id 9dd6a41a4b9a2ba166433a6f94d731ac11be69d7da6c633f30d8f909aec0b6a9 Apr 24 14:30:50.778282 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:50.778255 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:30:51.528102 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:51.528047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mn8sp" event={"ID":"a4128e61-3f80-4f61-b6bc-6602a75f282a","Type":"ContainerStarted","Data":"9dd6a41a4b9a2ba166433a6f94d731ac11be69d7da6c633f30d8f909aec0b6a9"} Apr 24 14:30:55.540490 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:55.540454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mn8sp" event={"ID":"a4128e61-3f80-4f61-b6bc-6602a75f282a","Type":"ContainerStarted","Data":"eedebc394c98ed5b839c8ac5921fa683c9d02fd9ef8aa913d7d57f41fcfe209e"} Apr 24 14:30:55.557773 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:30:55.557723 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mn8sp" podStartSLOduration=1.7112520820000001 podStartE2EDuration="5.557710429s" podCreationTimestamp="2026-04-24 14:30:50 
+0000 UTC" firstStartedPulling="2026-04-24 14:30:50.778388131 +0000 UTC m=+374.036290133" lastFinishedPulling="2026-04-24 14:30:54.624846476 +0000 UTC m=+377.882748480" observedRunningTime="2026-04-24 14:30:55.556553901 +0000 UTC m=+378.814455938" watchObservedRunningTime="2026-04-24 14:30:55.557710429 +0000 UTC m=+378.815612452" Apr 24 14:34:17.006941 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.006906 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-jrhsp"] Apr 24 14:34:17.009140 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.009122 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jrhsp" Apr 24 14:34:17.013258 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.013229 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 14:34:17.013392 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.013279 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 14:34:17.014051 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.014029 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-kcg4b\"" Apr 24 14:34:17.014146 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.014093 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 14:34:17.018546 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.018526 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jrhsp"] Apr 24 14:34:17.102227 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.102196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkqzd\" (UniqueName: \"kubernetes.io/projected/008c9bee-f8cc-4110-9592-a86a22503c7e-kube-api-access-lkqzd\") pod \"s3-init-jrhsp\" (UID: 
\"008c9bee-f8cc-4110-9592-a86a22503c7e\") " pod="kserve/s3-init-jrhsp" Apr 24 14:34:17.203010 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.202973 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkqzd\" (UniqueName: \"kubernetes.io/projected/008c9bee-f8cc-4110-9592-a86a22503c7e-kube-api-access-lkqzd\") pod \"s3-init-jrhsp\" (UID: \"008c9bee-f8cc-4110-9592-a86a22503c7e\") " pod="kserve/s3-init-jrhsp" Apr 24 14:34:17.213338 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.213307 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkqzd\" (UniqueName: \"kubernetes.io/projected/008c9bee-f8cc-4110-9592-a86a22503c7e-kube-api-access-lkqzd\") pod \"s3-init-jrhsp\" (UID: \"008c9bee-f8cc-4110-9592-a86a22503c7e\") " pod="kserve/s3-init-jrhsp" Apr 24 14:34:17.327755 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.327680 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jrhsp" Apr 24 14:34:17.461677 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:17.461644 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jrhsp"] Apr 24 14:34:18.096918 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:18.096878 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jrhsp" event={"ID":"008c9bee-f8cc-4110-9592-a86a22503c7e","Type":"ContainerStarted","Data":"084a2b81fb6895d99d703af56495ab956b4f5c5d30380554303acd0e36943ad0"} Apr 24 14:34:22.109478 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:22.109438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jrhsp" event={"ID":"008c9bee-f8cc-4110-9592-a86a22503c7e","Type":"ContainerStarted","Data":"c879ace22ef4181fe2a0d9ea17fb5ad93f933eccffc572ec9b34d2c92217e176"} Apr 24 14:34:22.128072 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:22.128018 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-jrhsp" 
podStartSLOduration=1.7817936300000001 podStartE2EDuration="6.127999849s" podCreationTimestamp="2026-04-24 14:34:16 +0000 UTC" firstStartedPulling="2026-04-24 14:34:17.466704113 +0000 UTC m=+580.724606128" lastFinishedPulling="2026-04-24 14:34:21.81291033 +0000 UTC m=+585.070812347" observedRunningTime="2026-04-24 14:34:22.127318828 +0000 UTC m=+585.385220852" watchObservedRunningTime="2026-04-24 14:34:22.127999849 +0000 UTC m=+585.385901874" Apr 24 14:34:25.119015 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:25.118979 2577 generic.go:358] "Generic (PLEG): container finished" podID="008c9bee-f8cc-4110-9592-a86a22503c7e" containerID="c879ace22ef4181fe2a0d9ea17fb5ad93f933eccffc572ec9b34d2c92217e176" exitCode=0 Apr 24 14:34:25.119368 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:25.119027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jrhsp" event={"ID":"008c9bee-f8cc-4110-9592-a86a22503c7e","Type":"ContainerDied","Data":"c879ace22ef4181fe2a0d9ea17fb5ad93f933eccffc572ec9b34d2c92217e176"} Apr 24 14:34:26.239953 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:26.239926 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jrhsp" Apr 24 14:34:26.371527 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:26.371490 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkqzd\" (UniqueName: \"kubernetes.io/projected/008c9bee-f8cc-4110-9592-a86a22503c7e-kube-api-access-lkqzd\") pod \"008c9bee-f8cc-4110-9592-a86a22503c7e\" (UID: \"008c9bee-f8cc-4110-9592-a86a22503c7e\") " Apr 24 14:34:26.373637 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:26.373579 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008c9bee-f8cc-4110-9592-a86a22503c7e-kube-api-access-lkqzd" (OuterVolumeSpecName: "kube-api-access-lkqzd") pod "008c9bee-f8cc-4110-9592-a86a22503c7e" (UID: "008c9bee-f8cc-4110-9592-a86a22503c7e"). 
InnerVolumeSpecName "kube-api-access-lkqzd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:34:26.472107 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:26.472069 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lkqzd\" (UniqueName: \"kubernetes.io/projected/008c9bee-f8cc-4110-9592-a86a22503c7e-kube-api-access-lkqzd\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\"" Apr 24 14:34:27.125198 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:27.125166 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jrhsp" event={"ID":"008c9bee-f8cc-4110-9592-a86a22503c7e","Type":"ContainerDied","Data":"084a2b81fb6895d99d703af56495ab956b4f5c5d30380554303acd0e36943ad0"} Apr 24 14:34:27.125198 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:27.125188 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jrhsp" Apr 24 14:34:27.125198 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:27.125201 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="084a2b81fb6895d99d703af56495ab956b4f5c5d30380554303acd0e36943ad0" Apr 24 14:34:37.251728 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:37.251702 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log" Apr 24 14:34:37.252175 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:34:37.251702 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log" Apr 24 14:39:37.270540 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:39:37.270511 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log" Apr 24 14:39:37.273945 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:39:37.273924 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log" Apr 24 14:44:37.292682 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:44:37.292650 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log" Apr 24 14:44:37.294348 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:44:37.294327 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log" Apr 24 14:48:15.800412 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.800373 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zhrkl/must-gather-z46r7"] Apr 24 14:48:15.800967 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.800682 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="008c9bee-f8cc-4110-9592-a86a22503c7e" containerName="s3-init" Apr 24 14:48:15.800967 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.800694 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="008c9bee-f8cc-4110-9592-a86a22503c7e" containerName="s3-init" Apr 24 14:48:15.800967 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.800762 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="008c9bee-f8cc-4110-9592-a86a22503c7e" containerName="s3-init" Apr 24 14:48:15.803741 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.803723 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:15.806256 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.806234 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zhrkl\"/\"openshift-service-ca.crt\"" Apr 24 14:48:15.807512 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.807491 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zhrkl\"/\"default-dockercfg-77rm7\"" Apr 24 14:48:15.807512 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.807504 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zhrkl\"/\"kube-root-ca.crt\"" Apr 24 14:48:15.811534 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.811512 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zhrkl/must-gather-z46r7"] Apr 24 14:48:15.924286 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.924252 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6b5f487-96fd-4a4d-b11c-528142efa532-must-gather-output\") pod \"must-gather-z46r7\" (UID: \"d6b5f487-96fd-4a4d-b11c-528142efa532\") " pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:15.924286 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:15.924290 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qd9k\" (UniqueName: \"kubernetes.io/projected/d6b5f487-96fd-4a4d-b11c-528142efa532-kube-api-access-2qd9k\") pod \"must-gather-z46r7\" (UID: \"d6b5f487-96fd-4a4d-b11c-528142efa532\") " pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:16.025160 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:16.025122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/d6b5f487-96fd-4a4d-b11c-528142efa532-must-gather-output\") pod \"must-gather-z46r7\" (UID: \"d6b5f487-96fd-4a4d-b11c-528142efa532\") " pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:16.025160 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:16.025164 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qd9k\" (UniqueName: \"kubernetes.io/projected/d6b5f487-96fd-4a4d-b11c-528142efa532-kube-api-access-2qd9k\") pod \"must-gather-z46r7\" (UID: \"d6b5f487-96fd-4a4d-b11c-528142efa532\") " pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:16.025527 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:16.025503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6b5f487-96fd-4a4d-b11c-528142efa532-must-gather-output\") pod \"must-gather-z46r7\" (UID: \"d6b5f487-96fd-4a4d-b11c-528142efa532\") " pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:16.035075 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:16.035049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qd9k\" (UniqueName: \"kubernetes.io/projected/d6b5f487-96fd-4a4d-b11c-528142efa532-kube-api-access-2qd9k\") pod \"must-gather-z46r7\" (UID: \"d6b5f487-96fd-4a4d-b11c-528142efa532\") " pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:16.120703 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:16.120674 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:16.242965 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:16.242926 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zhrkl/must-gather-z46r7"] Apr 24 14:48:16.245892 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:48:16.245851 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6b5f487_96fd_4a4d_b11c_528142efa532.slice/crio-2bd84c0211b8f910d8a46e8c1a4956a30d4de76380f4da69cf5b885793fac3a1 WatchSource:0}: Error finding container 2bd84c0211b8f910d8a46e8c1a4956a30d4de76380f4da69cf5b885793fac3a1: Status 404 returned error can't find the container with id 2bd84c0211b8f910d8a46e8c1a4956a30d4de76380f4da69cf5b885793fac3a1 Apr 24 14:48:16.247521 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:16.247504 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:48:16.392469 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:16.392374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zhrkl/must-gather-z46r7" event={"ID":"d6b5f487-96fd-4a4d-b11c-528142efa532","Type":"ContainerStarted","Data":"2bd84c0211b8f910d8a46e8c1a4956a30d4de76380f4da69cf5b885793fac3a1"} Apr 24 14:48:22.427623 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:22.427574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zhrkl/must-gather-z46r7" event={"ID":"d6b5f487-96fd-4a4d-b11c-528142efa532","Type":"ContainerStarted","Data":"903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64"} Apr 24 14:48:22.427623 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:22.427630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zhrkl/must-gather-z46r7" 
event={"ID":"d6b5f487-96fd-4a4d-b11c-528142efa532","Type":"ContainerStarted","Data":"2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e"} Apr 24 14:48:22.445664 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:22.445613 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zhrkl/must-gather-z46r7" podStartSLOduration=2.108198937 podStartE2EDuration="7.445598172s" podCreationTimestamp="2026-04-24 14:48:15 +0000 UTC" firstStartedPulling="2026-04-24 14:48:16.247632371 +0000 UTC m=+1419.505534373" lastFinishedPulling="2026-04-24 14:48:21.585031599 +0000 UTC m=+1424.842933608" observedRunningTime="2026-04-24 14:48:22.443575806 +0000 UTC m=+1425.701477832" watchObservedRunningTime="2026-04-24 14:48:22.445598172 +0000 UTC m=+1425.703500196" Apr 24 14:48:39.485105 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:39.485064 2577 generic.go:358] "Generic (PLEG): container finished" podID="d6b5f487-96fd-4a4d-b11c-528142efa532" containerID="2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e" exitCode=0 Apr 24 14:48:39.485648 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:39.485116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zhrkl/must-gather-z46r7" event={"ID":"d6b5f487-96fd-4a4d-b11c-528142efa532","Type":"ContainerDied","Data":"2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e"} Apr 24 14:48:39.485648 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:39.485528 2577 scope.go:117] "RemoveContainer" containerID="2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e" Apr 24 14:48:39.774274 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:39.774157 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zhrkl_must-gather-z46r7_d6b5f487-96fd-4a4d-b11c-528142efa532/gather/0.log" Apr 24 14:48:43.532045 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:43.532006 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-mn8sp_a4128e61-3f80-4f61-b6bc-6602a75f282a/global-pull-secret-syncer/0.log" Apr 24 14:48:43.731535 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:43.731499 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wt2bk_9bb73f96-473b-43fe-89d5-d7ba4f64faf2/konnectivity-agent/0.log" Apr 24 14:48:43.760219 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:43.760172 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-82.ec2.internal_20602652b465b1663ffa2d71aa406f85/haproxy/0.log" Apr 24 14:48:45.136284 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.136244 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zhrkl/must-gather-z46r7"] Apr 24 14:48:45.136681 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.136482 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-zhrkl/must-gather-z46r7" podUID="d6b5f487-96fd-4a4d-b11c-528142efa532" containerName="copy" containerID="cri-o://903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64" gracePeriod=2 Apr 24 14:48:45.140721 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.140533 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zhrkl/must-gather-z46r7"] Apr 24 14:48:45.358262 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.358235 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zhrkl_must-gather-z46r7_d6b5f487-96fd-4a4d-b11c-528142efa532/copy/0.log" Apr 24 14:48:45.358623 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.358607 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:45.451526 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.451432 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6b5f487-96fd-4a4d-b11c-528142efa532-must-gather-output\") pod \"d6b5f487-96fd-4a4d-b11c-528142efa532\" (UID: \"d6b5f487-96fd-4a4d-b11c-528142efa532\") " Apr 24 14:48:45.451526 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.451484 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qd9k\" (UniqueName: \"kubernetes.io/projected/d6b5f487-96fd-4a4d-b11c-528142efa532-kube-api-access-2qd9k\") pod \"d6b5f487-96fd-4a4d-b11c-528142efa532\" (UID: \"d6b5f487-96fd-4a4d-b11c-528142efa532\") " Apr 24 14:48:45.452839 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.452814 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b5f487-96fd-4a4d-b11c-528142efa532-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d6b5f487-96fd-4a4d-b11c-528142efa532" (UID: "d6b5f487-96fd-4a4d-b11c-528142efa532"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:48:45.453621 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.453600 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b5f487-96fd-4a4d-b11c-528142efa532-kube-api-access-2qd9k" (OuterVolumeSpecName: "kube-api-access-2qd9k") pod "d6b5f487-96fd-4a4d-b11c-528142efa532" (UID: "d6b5f487-96fd-4a4d-b11c-528142efa532"). InnerVolumeSpecName "kube-api-access-2qd9k". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:48:45.503167 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.503138 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zhrkl_must-gather-z46r7_d6b5f487-96fd-4a4d-b11c-528142efa532/copy/0.log" Apr 24 14:48:45.503490 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.503467 2577 generic.go:358] "Generic (PLEG): container finished" podID="d6b5f487-96fd-4a4d-b11c-528142efa532" containerID="903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64" exitCode=143 Apr 24 14:48:45.503535 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.503515 2577 scope.go:117] "RemoveContainer" containerID="903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64" Apr 24 14:48:45.503535 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.503518 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zhrkl/must-gather-z46r7" Apr 24 14:48:45.510986 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.510955 2577 scope.go:117] "RemoveContainer" containerID="2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e" Apr 24 14:48:45.525153 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.525107 2577 scope.go:117] "RemoveContainer" containerID="903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64" Apr 24 14:48:45.526082 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:48:45.526057 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64\": container with ID starting with 903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64 not found: ID does not exist" containerID="903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64" Apr 24 14:48:45.526214 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.526093 2577 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64"} err="failed to get container status \"903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64\": rpc error: code = NotFound desc = could not find container \"903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64\": container with ID starting with 903f232f58fbb4d7e341ec762903c91de8f0d826aab31268a479c24994597b64 not found: ID does not exist"
Apr 24 14:48:45.526214 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.526120 2577 scope.go:117] "RemoveContainer" containerID="2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e"
Apr 24 14:48:45.526505 ip-10-0-134-82 kubenswrapper[2577]: E0424 14:48:45.526434 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e\": container with ID starting with 2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e not found: ID does not exist" containerID="2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e"
Apr 24 14:48:45.526505 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.526475 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e"} err="failed to get container status \"2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e\": rpc error: code = NotFound desc = could not find container \"2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e\": container with ID starting with 2feb0752d4219fe5de52e673e59a0f320fc20e020f6c64ea8ed3f13afaab214e not found: ID does not exist"
Apr 24 14:48:45.553495 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.552984 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6b5f487-96fd-4a4d-b11c-528142efa532-must-gather-output\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:48:45.553495 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:45.553021 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2qd9k\" (UniqueName: \"kubernetes.io/projected/d6b5f487-96fd-4a4d-b11c-528142efa532-kube-api-access-2qd9k\") on node \"ip-10-0-134-82.ec2.internal\" DevicePath \"\""
Apr 24 14:48:47.330839 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:47.330760 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b5f487-96fd-4a4d-b11c-528142efa532" path="/var/lib/kubelet/pods/d6b5f487-96fd-4a4d-b11c-528142efa532/volumes"
Apr 24 14:48:47.494791 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:47.494762 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-zhnzw_250c3bae-e7c3-466a-b034-c78f4a29c643/monitoring-plugin/0.log"
Apr 24 14:48:47.624204 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:47.624175 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r6dqm_61fd49c2-35c9-478d-a0db-9c5d5a54b3db/node-exporter/0.log"
Apr 24 14:48:47.648154 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:47.648125 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r6dqm_61fd49c2-35c9-478d-a0db-9c5d5a54b3db/kube-rbac-proxy/0.log"
Apr 24 14:48:47.672592 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:47.672569 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r6dqm_61fd49c2-35c9-478d-a0db-9c5d5a54b3db/init-textfile/0.log"
Apr 24 14:48:47.913945 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:47.913839 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7eaa547e-cb49-4949-809e-8725c2e3ce11/prometheus/0.log"
Apr 24 14:48:47.934800 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:47.934774 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7eaa547e-cb49-4949-809e-8725c2e3ce11/config-reloader/0.log"
Apr 24 14:48:47.960544 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:47.960516 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7eaa547e-cb49-4949-809e-8725c2e3ce11/thanos-sidecar/0.log"
Apr 24 14:48:47.986301 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:47.986272 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7eaa547e-cb49-4949-809e-8725c2e3ce11/kube-rbac-proxy-web/0.log"
Apr 24 14:48:48.014379 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:48.014351 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7eaa547e-cb49-4949-809e-8725c2e3ce11/kube-rbac-proxy/0.log"
Apr 24 14:48:48.042647 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:48.042620 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7eaa547e-cb49-4949-809e-8725c2e3ce11/kube-rbac-proxy-thanos/0.log"
Apr 24 14:48:48.069186 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:48.069159 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7eaa547e-cb49-4949-809e-8725c2e3ce11/init-config-reloader/0.log"
Apr 24 14:48:48.104794 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:48.104767 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-26wsd_a033ef24-2d6c-4bf6-9271-934cc94deb41/prometheus-operator/0.log"
Apr 24 14:48:48.127450 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:48.127425 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-26wsd_a033ef24-2d6c-4bf6-9271-934cc94deb41/kube-rbac-proxy/0.log"
Apr 24 14:48:50.562175 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.562136 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"]
Apr 24 14:48:50.562647 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.562562 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6b5f487-96fd-4a4d-b11c-528142efa532" containerName="gather"
Apr 24 14:48:50.562647 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.562581 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b5f487-96fd-4a4d-b11c-528142efa532" containerName="gather"
Apr 24 14:48:50.562647 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.562591 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6b5f487-96fd-4a4d-b11c-528142efa532" containerName="copy"
Apr 24 14:48:50.562647 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.562599 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b5f487-96fd-4a4d-b11c-528142efa532" containerName="copy"
Apr 24 14:48:50.562851 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.562670 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6b5f487-96fd-4a4d-b11c-528142efa532" containerName="gather"
Apr 24 14:48:50.562851 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.562687 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6b5f487-96fd-4a4d-b11c-528142efa532" containerName="copy"
Apr 24 14:48:50.566503 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.566482 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.569062 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.569036 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7d7dd\"/\"openshift-service-ca.crt\""
Apr 24 14:48:50.569062 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.569050 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7d7dd\"/\"kube-root-ca.crt\""
Apr 24 14:48:50.570202 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.570182 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7d7dd\"/\"default-dockercfg-64fdx\""
Apr 24 14:48:50.574970 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.574946 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"]
Apr 24 14:48:50.697845 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.697805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-lib-modules\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.698047 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.697853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drvk\" (UniqueName: \"kubernetes.io/projected/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-kube-api-access-9drvk\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.698047 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.697954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-sys\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.698047 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.697993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-proc\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.698047 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.698015 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-podres\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.799194 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.799154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-lib-modules\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.799194 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.799194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9drvk\" (UniqueName: \"kubernetes.io/projected/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-kube-api-access-9drvk\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.799410 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.799316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-sys\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.799410 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.799339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-lib-modules\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.799410 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.799356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-proc\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.799410 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.799380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-podres\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.799410 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.799394 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-sys\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.799570 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.799475 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-podres\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.799570 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.799478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-proc\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.810971 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.810942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drvk\" (UniqueName: \"kubernetes.io/projected/c11d8cab-c38d-4b04-a3d3-06ef52dbbca2-kube-api-access-9drvk\") pod \"perf-node-gather-daemonset-6744d\" (UID: \"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.876627 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.876592 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:50.996699 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:50.996671 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"]
Apr 24 14:48:50.999228 ip-10-0-134-82 kubenswrapper[2577]: W0424 14:48:50.999196 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc11d8cab_c38d_4b04_a3d3_06ef52dbbca2.slice/crio-3a79ecac9ef5a324a97e5503089924699ed5da9cb842cc90e36c7064d1074456 WatchSource:0}: Error finding container 3a79ecac9ef5a324a97e5503089924699ed5da9cb842cc90e36c7064d1074456: Status 404 returned error can't find the container with id 3a79ecac9ef5a324a97e5503089924699ed5da9cb842cc90e36c7064d1074456
Apr 24 14:48:51.522671 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:51.522581 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d" event={"ID":"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2","Type":"ContainerStarted","Data":"da73b922a4ff3a867354bedf147228e8118515dd353a7d773d2ad9c6f64dbd8b"}
Apr 24 14:48:51.522671 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:51.522627 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d" event={"ID":"c11d8cab-c38d-4b04-a3d3-06ef52dbbca2","Type":"ContainerStarted","Data":"3a79ecac9ef5a324a97e5503089924699ed5da9cb842cc90e36c7064d1074456"}
Apr 24 14:48:51.522857 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:51.522697 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:48:51.539524 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:51.539474 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d" podStartSLOduration=1.539459787 podStartE2EDuration="1.539459787s" podCreationTimestamp="2026-04-24 14:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:48:51.538099238 +0000 UTC m=+1454.796001262" watchObservedRunningTime="2026-04-24 14:48:51.539459787 +0000 UTC m=+1454.797361810"
Apr 24 14:48:51.690235 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:51.690200 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wp7sh_a9a9c7a0-840a-4811-89fd-c85ae43af97f/dns/0.log"
Apr 24 14:48:51.716309 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:51.716271 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wp7sh_a9a9c7a0-840a-4811-89fd-c85ae43af97f/kube-rbac-proxy/0.log"
Apr 24 14:48:51.852583 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:51.852558 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w4bb5_5c3fa93e-ebcf-4859-a30a-a5dfd8bd28e8/dns-node-resolver/0.log"
Apr 24 14:48:52.288456 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:52.288377 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6b7d4b657c-jctkr_349f1bd8-8b03-47cc-8acb-0e815c249834/registry/0.log"
Apr 24 14:48:52.349188 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:52.349157 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7nph4_9bac24c0-4d8b-4f25-88b3-6d4cebc649bf/node-ca/0.log"
Apr 24 14:48:53.606766 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:53.606732 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vsblc_ce9c2869-28ce-47d7-a2d8-19e09fc982ba/serve-healthcheck-canary/0.log"
Apr 24 14:48:54.198764 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:54.198732 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xkrr7_ff15be00-d8eb-41e8-a1d4-e126f8a91dc6/kube-rbac-proxy/0.log"
Apr 24 14:48:54.224693 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:54.224665 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xkrr7_ff15be00-d8eb-41e8-a1d4-e126f8a91dc6/exporter/0.log"
Apr 24 14:48:54.251444 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:54.251412 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xkrr7_ff15be00-d8eb-41e8-a1d4-e126f8a91dc6/extractor/0.log"
Apr 24 14:48:56.459917 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:56.459881 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-jrhsp_008c9bee-f8cc-4110-9592-a86a22503c7e/s3-init/0.log"
Apr 24 14:48:57.535133 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:48:57.535107 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-6744d"
Apr 24 14:49:00.778755 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:00.778721 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kxnmm_7b62e7c5-2bcb-415c-ad16-77c9c85fa204/migrator/0.log"
Apr 24 14:49:00.803398 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:00.803362 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kxnmm_7b62e7c5-2bcb-415c-ad16-77c9c85fa204/graceful-termination/0.log"
Apr 24 14:49:01.138785 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:01.138755 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8fpnr_5bf1887d-7526-48ef-aa33-2cf15cf8ced2/kube-storage-version-migrator-operator/1.log"
Apr 24 14:49:01.140335 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:01.140300 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8fpnr_5bf1887d-7526-48ef-aa33-2cf15cf8ced2/kube-storage-version-migrator-operator/0.log"
Apr 24 14:49:02.347593 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.347569 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kplkp_66176040-5bda-46d8-aaba-ef37c25ad37e/kube-multus-additional-cni-plugins/0.log"
Apr 24 14:49:02.377349 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.377311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kplkp_66176040-5bda-46d8-aaba-ef37c25ad37e/egress-router-binary-copy/0.log"
Apr 24 14:49:02.408378 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.408343 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kplkp_66176040-5bda-46d8-aaba-ef37c25ad37e/cni-plugins/0.log"
Apr 24 14:49:02.434412 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.434379 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kplkp_66176040-5bda-46d8-aaba-ef37c25ad37e/bond-cni-plugin/0.log"
Apr 24 14:49:02.463732 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.463703 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kplkp_66176040-5bda-46d8-aaba-ef37c25ad37e/routeoverride-cni/0.log"
Apr 24 14:49:02.492407 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.492377 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kplkp_66176040-5bda-46d8-aaba-ef37c25ad37e/whereabouts-cni-bincopy/0.log"
Apr 24 14:49:02.518612 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.518580 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kplkp_66176040-5bda-46d8-aaba-ef37c25ad37e/whereabouts-cni/0.log"
Apr 24 14:49:02.777317 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.777225 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p6pq2_0ce9a076-3b33-4c2e-b35e-6fb3cf4fce67/kube-multus/0.log"
Apr 24 14:49:02.835093 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.835058 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-49nxb_216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46/network-metrics-daemon/0.log"
Apr 24 14:49:02.864713 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:02.864679 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-49nxb_216f65fd-4d9d-4b6a-a1e1-a8bbfe802a46/kube-rbac-proxy/0.log"
Apr 24 14:49:04.566578 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:04.566546 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-controller/0.log"
Apr 24 14:49:04.589695 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:04.589667 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/0.log"
Apr 24 14:49:04.602514 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:04.602488 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovn-acl-logging/1.log"
Apr 24 14:49:04.628080 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:04.628054 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/kube-rbac-proxy-node/0.log"
Apr 24 14:49:04.655371 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:04.655334 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 14:49:04.677796 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:04.677770 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/northd/0.log"
Apr 24 14:49:04.703471 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:04.703446 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/nbdb/0.log"
Apr 24 14:49:04.730506 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:04.730469 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/sbdb/0.log"
Apr 24 14:49:04.986701 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:04.986658 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdc2t_48c500e5-ff8b-4e0c-bdda-745035b2e024/ovnkube-controller/0.log"
Apr 24 14:49:06.142351 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:06.142321 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rn5ql_442ed584-8835-435d-8b83-97804ed0f554/network-check-target-container/0.log"
Apr 24 14:49:07.139144 ip-10-0-134-82 kubenswrapper[2577]: I0424 14:49:07.139113 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-4fxh7_c100dd1e-57a3-471e-998a-d002af692c13/iptables-alerter/0.log"