Apr 24 19:04:32.104932 ip-10-0-129-23 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 19:04:32.104946 ip-10-0-129-23 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 19:04:32.104956 ip-10-0-129-23 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 19:04:32.105292 ip-10-0-129-23 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 19:04:42.124741 ip-10-0-129-23 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 19:04:42.124761 ip-10-0-129-23 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 40d8cd8131ef4fe99916cfe9d1bb9770 --
Apr 24 19:07:15.232232 ip-10-0-129-23 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 19:07:15.718701 ip-10-0-129-23 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:07:15.718701 ip-10-0-129-23 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 19:07:15.718701 ip-10-0-129-23 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:07:15.718701 ip-10-0-129-23 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 19:07:15.718701 ip-10-0-129-23 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:07:15.722770 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.722680 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 19:07:15.725906 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725891 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:07:15.725906 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725907 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725911 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725915 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725918 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725921 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725924 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725927 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725930 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725933 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725936 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725939 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725942 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725944 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725947 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725960 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725964 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725968 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725972 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725975 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:07:15.725981 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725978 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725981 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725985 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725990 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725993 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725996 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.725999 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726001 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726005 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726008 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726010 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726013 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726016 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726019 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726021 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726024 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726028 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726031 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726033 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:07:15.726478 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726036 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726038 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726041 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726043 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726046 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726048 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726051 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726053 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726056 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726059 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726061 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726064 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726066 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726069 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726071 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726075 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726078 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726082 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726084 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726087 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:07:15.727102 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726089 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726092 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726095 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726098 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726100 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726103 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726105 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726108 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726111 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726113 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726116 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726119 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726121 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726124 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726126 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726129 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726131 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726134 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726136 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726139 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:07:15.727772 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726141 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726144 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726146 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726149 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726152 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726154 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726158 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726723 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726735 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726747 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726752 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726757 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726761 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726765 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726770 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726774 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726779 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726783 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726787 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726792 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:07:15.728465 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726796 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726801 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726812 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726816 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726821 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726825 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726829 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726834 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726838 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726843 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726847 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726851 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726855 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726859 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726864 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726874 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726878 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726883 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726888 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:07:15.729065 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726892 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726897 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726902 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726906 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726912 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726916 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726921 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726931 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726935 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726940 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726944 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726948 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726968 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726972 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726976 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726981 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726986 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726991 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.726995 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727008 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:07:15.729782 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727014 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727019 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727024 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727028 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727032 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727037 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727041 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727045 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727050 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727054 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727058 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727067 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727072 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727077 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727081 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727086 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727090 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727094 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727099 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:07:15.730663 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727104 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727108 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727115 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727121 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727131 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727136 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727141 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727146 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727151 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727155 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727161 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727166 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727170 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727174 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.727179 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.728557 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.728690 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729325 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729334 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729341 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729347 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 19:07:15.731188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729354 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729362 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729368 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729373 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729378 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729384 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729389 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729394 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729398 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729404 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729408 2568 flags.go:64] FLAG: --cloud-config=""
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729413 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729418 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729427 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729431 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729437 2568 flags.go:64] FLAG: --config-dir=""
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729442 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729448 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729454 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729458 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729463 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729469 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729473 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729479 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 19:07:15.731876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729485 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729491 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729496 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729504 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729510 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729514 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729519 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729524 2568 flags.go:64] FLAG: --enable-server="true"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729528 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729535 2568 flags.go:64] FLAG: --event-burst="100"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729540 2568 flags.go:64] FLAG: --event-qps="50"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729545 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729550 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729554 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729561 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729566 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729571 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729576 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729581 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729586 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729590 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729595 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729600 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729604 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 19:07:15.732725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729609 2568 flags.go:64] FLAG: --feature-gates=""
Apr 24 19:07:15.732725 ip-10-0-129-23
kubenswrapper[2568]: I0424 19:07:15.729614 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729620 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729625 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729630 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729635 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729640 2568 flags.go:64] FLAG: --help="false" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729645 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-129-23.ec2.internal" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729651 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729657 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729662 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729669 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729677 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729682 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729687 2568 flags.go:64] FLAG: 
--image-service-endpoint="" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729691 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729696 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729701 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729706 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729711 2568 flags.go:64] FLAG: --kube-reserved="" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729716 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729721 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729726 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729731 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729735 2568 flags.go:64] FLAG: --lock-file="" Apr 24 19:07:15.733354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729740 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729745 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729750 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729759 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 
19:07:15.729764 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729768 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729773 2568 flags.go:64] FLAG: --logging-format="text" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729778 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729783 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729787 2568 flags.go:64] FLAG: --manifest-url="" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729792 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729799 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729804 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729811 2568 flags.go:64] FLAG: --max-pods="110" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729816 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729821 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729826 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729831 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729837 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" 
Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729842 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729847 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729860 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729865 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729870 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 19:07:15.734045 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729875 2568 flags.go:64] FLAG: --pod-cidr="" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729880 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729888 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729892 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729897 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729902 2568 flags.go:64] FLAG: --port="10250" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729907 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729912 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0135e0e11dd3ded57" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729917 2568 flags.go:64] FLAG: 
--qos-reserved="" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729922 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729928 2568 flags.go:64] FLAG: --register-node="true" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729932 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729938 2568 flags.go:64] FLAG: --register-with-taints="" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729943 2568 flags.go:64] FLAG: --registry-burst="10" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729948 2568 flags.go:64] FLAG: --registry-qps="5" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729968 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729974 2568 flags.go:64] FLAG: --reserved-memory="" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729980 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729985 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729990 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.729995 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730000 2568 flags.go:64] FLAG: --runonce="false" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730004 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730010 2568 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730015 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 24 19:07:15.734627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730019 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730024 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730029 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730034 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730060 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730067 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730072 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730077 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730082 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730087 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730092 2568 flags.go:64] FLAG: --system-cgroups="" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730097 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730107 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 19:07:15.735241 
ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730112 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730117 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730125 2568 flags.go:64] FLAG: --tls-min-version="" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730130 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730134 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730139 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730144 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730149 2568 flags.go:64] FLAG: --v="2" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730155 2568 flags.go:64] FLAG: --version="false" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730161 2568 flags.go:64] FLAG: --vmodule="" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730169 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.730174 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 19:07:15.735241 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730325 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730333 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730339 2568 
feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730344 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730349 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730354 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730359 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730363 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730368 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730372 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730376 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730381 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730385 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730390 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730396 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730410 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730415 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730420 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730424 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 19:07:15.735845 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730428 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730432 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730436 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730441 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730444 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730448 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730452 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730457 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 
19:07:15.730461 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730466 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730470 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730474 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730478 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730485 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730491 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730496 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730501 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730505 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730510 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 19:07:15.736355 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730514 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730519 2568 feature_gate.go:328] unrecognized feature 
gate: MachineAPIMigration Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730523 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730527 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730531 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730536 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730540 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730544 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730548 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730553 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730557 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730561 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730566 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730570 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730574 2568 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730578 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730582 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730587 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730591 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730595 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 19:07:15.736884 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730599 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730603 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730607 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730611 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730615 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730620 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730625 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 19:07:15.737396 ip-10-0-129-23 
kubenswrapper[2568]: W0424 19:07:15.730629 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730634 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730638 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730642 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730646 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730651 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730655 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730659 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730664 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730667 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730671 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730675 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730678 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting 
Apr 24 19:07:15.737396 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730682 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:07:15.737879 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730686 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:07:15.737879 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730690 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:07:15.737879 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730695 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:07:15.737879 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730700 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:07:15.737879 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730705 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:07:15.737879 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730709 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:07:15.737879 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.730713 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:07:15.737879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.731382 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:07:15.738666 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.738648 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 19:07:15.738666 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.738666 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 19:07:15.738724 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738711 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:07:15.738724 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738716 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:07:15.738724 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738720 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:07:15.738724 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738724 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738727 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738731 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738734 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738737 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738740 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738743 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738746 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738748 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738751 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738755 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738757 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738760 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738763 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738765 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738768 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738771 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738773 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738776 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738778 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:07:15.738822 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738781 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738784 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738786 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738789 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738791 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738794 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738797 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738800 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738802 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738805 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738808 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738811 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738813 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738816 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738819 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738822 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738825 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738827 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738830 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738833 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:07:15.739333 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738836 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738838 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738841 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738844 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738847 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738849 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738852 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738855 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738857 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738860 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738862 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738865 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738867 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738870 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738873 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738875 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738878 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738882 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738885 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:07:15.739816 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738888 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738891 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738893 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738896 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738899 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738901 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738903 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738906 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738909 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738912 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738914 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738917 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738919 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738922 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738925 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738928 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738931 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738937 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738940 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738943 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:07:15.740330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738945 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738948 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738965 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.738969 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.738974 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739068 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739072 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739075 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739078 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739080 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739083 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739086 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739089 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739092 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739094 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:07:15.740818 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739097 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739100 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739102 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739105 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739108 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739110 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739113 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739115 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739118 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739121 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739123 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739126 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739128 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739131 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739134 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739137 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739139 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739142 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739144 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:07:15.741300 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739146 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739149 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739151 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739154 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739156 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739159 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739162 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739165 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739168 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739170 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739173 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739176 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739180 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739183 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739186 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739189 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739192 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739195 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739198 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:07:15.741757 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739200 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739202 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739205 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739208 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739210 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739212 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739215 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739217 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739220 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739222 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739225 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739228 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739230 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739233 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739235 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739237 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739240 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739260 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739264 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739267 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:07:15.742230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739269 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739272 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739275 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739279 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739281 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739284 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739286 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739289 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739291 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739294 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739296 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739299 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739301 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739304 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739307 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739309 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739312 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:07:15.742728 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:15.739314 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:07:15.743153 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.739319 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:07:15.743153 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.740032 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 19:07:15.743539 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.743526 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 19:07:15.744836 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.744825 2568 server.go:1019] "Starting client certificate rotation"
Apr 24 19:07:15.744935 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.744924 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:07:15.744982 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.744969 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:07:15.770252 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.770235 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:07:15.773513 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.773492 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:07:15.794735 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.794716 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 24 19:07:15.797552 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.797535 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 19:07:15.800564 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.800547 2568 log.go:25] "Validated CRI v1 image API"
Apr 24 19:07:15.801751 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.801736 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 19:07:15.807921 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.807898 2568 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 aa85a697-f6f0-4eb3-ba08-53c02cea617a:/dev/nvme0n1p3 ca96f3d8-59c3-4e25-bb65-6a53a4838746:/dev/nvme0n1p4]
Apr 24 19:07:15.807996 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.807921 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 19:07:15.814660 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.814559 2568 manager.go:217] Machine: {Timestamp:2026-04-24 19:07:15.811879217 +0000 UTC m=+0.450999069 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2499996 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f123a30a6e455064da58c84509b1b SystemUUID:ec2f123a-30a6-e455-064d-a58c84509b1b BootID:40d8cd81-31ef-4fe9-9916-cfe9d1bb9770 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c9:db:6b:c0:cb Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c9:db:6b:c0:cb Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:88:5a:8a:58:74 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 19:07:15.814660 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.814655 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 19:07:15.814766 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.814731 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 19:07:15.815897 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.815872 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 19:07:15.816069 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.815899 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-23.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 19:07:15.816120 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.816079 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 19:07:15.816120 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.816087 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 19:07:15.816120 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.816100 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 19:07:15.817012 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.817001 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 19:07:15.818378 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.818368 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:07:15.818484 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.818474 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 19:07:15.820482 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.820463 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lppjk" Apr 24 19:07:15.821434 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.821424 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 24 19:07:15.821470 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.821440 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 19:07:15.821470 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.821456 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 19:07:15.821470 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:07:15.821465 2568 kubelet.go:397] "Adding apiserver pod source" Apr 24 19:07:15.821598 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.821474 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 19:07:15.823028 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.823014 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 19:07:15.823105 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.823032 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 19:07:15.826141 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.826126 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 19:07:15.827592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.827575 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lppjk" Apr 24 19:07:15.828048 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.828035 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 19:07:15.829452 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829438 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 19:07:15.829495 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829461 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 19:07:15.829495 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829471 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 19:07:15.829495 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829480 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 19:07:15.829495 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:07:15.829489 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 19:07:15.829598 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829498 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 19:07:15.829598 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829507 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 19:07:15.829598 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829515 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 19:07:15.829598 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829526 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 19:07:15.829598 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829535 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 19:07:15.829598 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829557 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 19:07:15.829598 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.829570 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 19:07:15.831503 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.831490 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 19:07:15.831548 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.831504 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 19:07:15.834504 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.834491 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:07:15.835487 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.835475 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 19:07:15.835527 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.835513 2568 server.go:1295] 
"Started kubelet" Apr 24 19:07:15.835663 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.835620 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 19:07:15.835723 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.835678 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 19:07:15.836040 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.836004 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 19:07:15.836359 ip-10-0-129-23 systemd[1]: Started Kubernetes Kubelet. Apr 24 19:07:15.836537 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.836514 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:07:15.836892 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.836850 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 19:07:15.843004 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.842981 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-23.ec2.internal" not found Apr 24 19:07:15.843531 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.843514 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 24 19:07:15.848244 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.848215 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 19:07:15.848485 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:15.848466 2568 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 19:07:15.848728 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.848709 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 19:07:15.849432 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849416 2568 factory.go:55] Registering systemd factory Apr 24 19:07:15.849515 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849442 2568 factory.go:223] Registration of the systemd container factory successfully Apr 24 19:07:15.849650 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849631 2568 factory.go:153] Registering CRI-O factory Apr 24 19:07:15.849650 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849647 2568 factory.go:223] Registration of the crio container factory successfully Apr 24 19:07:15.849761 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849695 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 19:07:15.849761 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849725 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 19:07:15.849761 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849739 2568 factory.go:103] Registering Raw factory Apr 24 19:07:15.849761 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849752 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 19:07:15.849894 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849755 2568 manager.go:1196] Started watching for new ooms in manager Apr 24 19:07:15.849894 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849726 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 19:07:15.849894 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:15.849851 2568 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-23.ec2.internal\" not found" Apr 24 19:07:15.849894 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849852 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 24 19:07:15.849894 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.849873 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 24 19:07:15.850510 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.850493 2568 manager.go:319] Starting recovery of all containers Apr 24 19:07:15.851355 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.851337 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:07:15.853816 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:15.853790 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-23.ec2.internal\" not found" node="ip-10-0-129-23.ec2.internal" Apr 24 19:07:15.858132 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.858115 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-23.ec2.internal" not found Apr 24 19:07:15.860280 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.860130 2568 manager.go:324] Recovery completed Apr 24 19:07:15.864242 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.864229 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:07:15.866011 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.865994 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-23.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:07:15.866078 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.866025 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-23.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:07:15.866078 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.866037 2568 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-23.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:07:15.866465 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.866452 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 19:07:15.866465 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.866463 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 19:07:15.866541 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.866478 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:07:15.868830 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.868820 2568 policy_none.go:49] "None policy: Start" Apr 24 19:07:15.868873 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.868834 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 19:07:15.868873 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.868844 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 24 19:07:15.907348 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.907333 2568 manager.go:341] "Starting Device Plugin manager" Apr 24 19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:15.907362 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.907371 2568 server.go:85] "Starting device plugin registration server" Apr 24 19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.907732 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.907744 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.907905 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 
19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.907981 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.907987 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:15.908504 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:15.908542 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-23.ec2.internal\" not found" Apr 24 19:07:15.925115 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.918654 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-23.ec2.internal" not found Apr 24 19:07:15.977410 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.977342 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 19:07:15.978489 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.978463 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 19:07:15.978489 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.978490 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 19:07:15.978632 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.978510 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 19:07:15.978632 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.978520 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 19:07:15.978632 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:15.978559 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 19:07:15.981851 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:15.981831 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:07:16.008781 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.008767 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:07:16.009504 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.009489 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-23.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:07:16.009574 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.009517 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-23.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:07:16.009574 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.009531 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-23.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:07:16.009574 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.009557 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.018746 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.018730 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.018799 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:16.018756 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-23.ec2.internal\": node \"ip-10-0-129-23.ec2.internal\" not found" Apr 24 19:07:16.079964 
ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.079931 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal"] Apr 24 19:07:16.082887 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.082871 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.082980 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.082879 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.106506 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.106491 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.110857 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.110843 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.122706 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.122689 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:07:16.124806 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.124794 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:07:16.251595 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.251513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8067dec5c6d2a3df6f2c71b42f411e30-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal\" (UID: \"8067dec5c6d2a3df6f2c71b42f411e30\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.251595 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.251541 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8067dec5c6d2a3df6f2c71b42f411e30-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal\" (UID: \"8067dec5c6d2a3df6f2c71b42f411e30\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.251595 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.251558 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5617684810147cab138b7763a579ba59-config\") pod \"kube-apiserver-proxy-ip-10-0-129-23.ec2.internal\" (UID: \"5617684810147cab138b7763a579ba59\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.352140 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.352113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8067dec5c6d2a3df6f2c71b42f411e30-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal\" (UID: \"8067dec5c6d2a3df6f2c71b42f411e30\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.352283 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.352143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8067dec5c6d2a3df6f2c71b42f411e30-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal\" (UID: \"8067dec5c6d2a3df6f2c71b42f411e30\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.352283 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.352160 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5617684810147cab138b7763a579ba59-config\") pod \"kube-apiserver-proxy-ip-10-0-129-23.ec2.internal\" (UID: \"5617684810147cab138b7763a579ba59\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.352283 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.352186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5617684810147cab138b7763a579ba59-config\") pod \"kube-apiserver-proxy-ip-10-0-129-23.ec2.internal\" (UID: \"5617684810147cab138b7763a579ba59\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.352283 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.352206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8067dec5c6d2a3df6f2c71b42f411e30-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal\" (UID: \"8067dec5c6d2a3df6f2c71b42f411e30\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.352283 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.352216 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8067dec5c6d2a3df6f2c71b42f411e30-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal\" (UID: \"8067dec5c6d2a3df6f2c71b42f411e30\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.425301 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.425264 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.428040 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.428022 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal" Apr 24 19:07:16.744816 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.744748 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 19:07:16.745266 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.744892 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 19:07:16.745266 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.744914 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 19:07:16.745266 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.744925 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 19:07:16.822219 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.822195 2568 apiserver.go:52] "Watching apiserver" Apr 24 19:07:16.828766 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.828744 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 19:07:16.829627 ip-10-0-129-23 kubenswrapper[2568]: I0424 
19:07:16.829597 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 19:02:15 +0000 UTC" deadline="2027-11-13 01:46:29.225929464 +0000 UTC" Apr 24 19:07:16.829690 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.829627 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13614h39m12.396305075s" Apr 24 19:07:16.830901 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.830879 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-4f9zh","openshift-cluster-node-tuning-operator/tuned-lppw8","openshift-dns/node-resolver-m6mmt","openshift-image-registry/node-ca-vdgfs","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal","openshift-multus/multus-lx4cb","openshift-network-diagnostics/network-check-target-b49hn","kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx","openshift-multus/multus-additional-cni-plugins-qxnnz","openshift-multus/network-metrics-daemon-4lz47","openshift-network-operator/iptables-alerter-b2kgr","openshift-ovn-kubernetes/ovnkube-node-tj77d"] Apr 24 19:07:16.833660 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.833633 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:16.834919 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.834904 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.836014 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.835997 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mzd8w\"" Apr 24 19:07:16.836098 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.836019 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 19:07:16.836098 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.836026 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 19:07:16.836221 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.836205 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m6mmt" Apr 24 19:07:16.837063 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.837042 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 19:07:16.837063 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.837060 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:07:16.837213 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.837198 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xkz9p\"" Apr 24 19:07:16.837483 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.837468 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vdgfs" Apr 24 19:07:16.838217 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.838200 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 19:07:16.838319 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.838280 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jf8c5\"" Apr 24 19:07:16.838387 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.838376 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 19:07:16.838549 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.838534 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.838655 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.838621 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn" Apr 24 19:07:16.838733 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:16.838686 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2" Apr 24 19:07:16.839567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.839548 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-52zwb\"" Apr 24 19:07:16.840084 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.840062 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 19:07:16.840181 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.840162 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 19:07:16.841276 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.840599 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 19:07:16.841276 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.840733 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.841276 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.841106 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 19:07:16.841445 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.841373 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xcndp\"" Apr 24 19:07:16.841993 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.841693 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 19:07:16.841993 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.841897 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 19:07:16.842150 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.842102 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 19:07:16.842831 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.842812 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 19:07:16.842940 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.842884 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 19:07:16.843447 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.843428 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.843657 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.843634 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lgxvb\"" Apr 24 19:07:16.843738 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.843643 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 19:07:16.844639 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.844624 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:16.844729 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:16.844692 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e" Apr 24 19:07:16.845913 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.845885 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 19:07:16.846059 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.846018 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-b2kgr" Apr 24 19:07:16.846354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.846243 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 19:07:16.846536 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.846522 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cpffq\"" Apr 24 19:07:16.847868 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.847850 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.848225 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.848038 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 19:07:16.848225 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.848039 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pgvc8\"" Apr 24 19:07:16.848368 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.848299 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 19:07:16.848517 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.848497 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:07:16.848629 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.848517 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 19:07:16.850574 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.850557 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 19:07:16.850915 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.850895 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 19:07:16.851010 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.850916 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 19:07:16.851010 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.850987 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 19:07:16.851010 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.850992 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 19:07:16.851166 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.851148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sd4tx\"" Apr 24 19:07:16.851216 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.851165 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 19:07:16.851267 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.851246 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 19:07:16.854502 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-var-lib-cni-bin\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.854590 
ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854508 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-sys-fs\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.854590 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pv8p\" (UniqueName: \"kubernetes.io/projected/1186bd80-4999-47f8-b309-3246becab924-kube-api-access-5pv8p\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.854590 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854539 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:16.854590 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854561 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-log-socket\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.854723 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854595 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d49788ac-b5cf-4dfb-9670-2385671fc731-ovn-node-metrics-cert\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.854723 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854617 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-sysctl-d\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.854723 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854634 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-sysctl-conf\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.854723 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854648 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1e21e827-de03-48d5-b7ca-3a5a1c529873-tmp-dir\") pod \"node-resolver-m6mmt\" (UID: \"1e21e827-de03-48d5-b7ca-3a5a1c529873\") " pod="openshift-dns/node-resolver-m6mmt" Apr 24 19:07:16.854723 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854681 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-run-ovn\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.854723 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854709 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-systemd\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.854905 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854729 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-run\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.854905 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854747 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-hostroot\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.854905 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.854905 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854778 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-var-lib-openvswitch\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.854905 
ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854792 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f9a2db2-5738-4b14-a835-27706918a96e-multus-daemon-config\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.854905 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854812 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-run-multus-certs\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.854905 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854830 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-run-openvswitch\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.854905 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854845 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9-konnectivity-ca\") pod \"konnectivity-agent-4f9zh\" (UID: \"e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9\") " pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:16.854905 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854858 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-var-lib-kubelet\") pod \"tuned-lppw8\" (UID: 
\"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.854905 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854872 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9-agent-certs\") pod \"konnectivity-agent-4f9zh\" (UID: \"e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9\") " pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854916 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef8e461f-b2c2-42d8-9ae0-451164801b2f-tmp\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854933 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-run-netns\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854947 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-slash\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.854989 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-multus-socket-dir-parent\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855005 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6923e9e9-0a10-445f-9824-663ad232ab97-iptables-alerter-script\") pod \"iptables-alerter-b2kgr\" (UID: \"6923e9e9-0a10-445f-9824-663ad232ab97\") " pod="openshift-network-operator/iptables-alerter-b2kgr" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855028 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-sysconfig\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855041 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-host\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855062 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9w7\" (UniqueName: \"kubernetes.io/projected/ef8e461f-b2c2-42d8-9ae0-451164801b2f-kube-api-access-lq9w7\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855100 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xm9k\" (UniqueName: \"kubernetes.io/projected/3f9a2db2-5738-4b14-a835-27706918a96e-kube-api-access-2xm9k\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855125 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-registration-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-system-cni-dir\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855165 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-os-release\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855187 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6923e9e9-0a10-445f-9824-663ad232ab97-host-slash\") pod \"iptables-alerter-b2kgr\" (UID: 
\"6923e9e9-0a10-445f-9824-663ad232ab97\") " pod="openshift-network-operator/iptables-alerter-b2kgr" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.855246 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855229 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-device-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855253 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpbfj\" (UniqueName: \"kubernetes.io/projected/1cb59573-7573-4947-ac01-0812a566ca34-kube-api-access-hpbfj\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855272 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1186bd80-4999-47f8-b309-3246becab924-cni-binary-copy\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855288 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q24f2\" (UniqueName: \"kubernetes.io/projected/060d8b4b-7fbe-4109-888d-a5c4822cff6e-kube-api-access-q24f2\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855303 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-lib-modules\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855318 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1e21e827-de03-48d5-b7ca-3a5a1c529873-hosts-file\") pod \"node-resolver-m6mmt\" (UID: \"1e21e827-de03-48d5-b7ca-3a5a1c529873\") " pod="openshift-dns/node-resolver-m6mmt" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855336 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-os-release\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-var-lib-cni-multus\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 
19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-var-lib-kubelet\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d49788ac-b5cf-4dfb-9670-2385671fc731-env-overrides\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855398 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d49788ac-b5cf-4dfb-9670-2385671fc731-ovnkube-script-lib\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855412 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-cnibin\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855449 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f9a2db2-5738-4b14-a835-27706918a96e-cni-binary-copy\") pod 
\"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855464 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-multus-conf-dir\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855479 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-socket-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855516 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4be73708-29e9-4ed5-856c-a07616631d8e-host\") pod \"node-ca-vdgfs\" (UID: \"4be73708-29e9-4ed5-856c-a07616631d8e\") " pod="openshift-image-registry/node-ca-vdgfs" Apr 24 19:07:16.855720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855567 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4be73708-29e9-4ed5-856c-a07616631d8e-serviceca\") pod \"node-ca-vdgfs\" (UID: \"4be73708-29e9-4ed5-856c-a07616631d8e\") " pod="openshift-image-registry/node-ca-vdgfs" Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855588 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6hh6\" (UniqueName: 
\"kubernetes.io/projected/4be73708-29e9-4ed5-856c-a07616631d8e-kube-api-access-b6hh6\") pod \"node-ca-vdgfs\" (UID: \"4be73708-29e9-4ed5-856c-a07616631d8e\") " pod="openshift-image-registry/node-ca-vdgfs"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1186bd80-4999-47f8-b309-3246becab924-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855629 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-systemd-units\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855644 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-run-k8s-cni-cncf-io\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855660 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-multus-cni-dir\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-cnibin\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855730 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhdl\" (UniqueName: \"kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl\") pod \"network-check-target-b49hn\" (UID: \"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2\") " pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855749 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1186bd80-4999-47f8-b309-3246becab924-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855777 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-run-netns\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855797 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-run-systemd\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855812 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-kubelet\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855831 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-cni-bin\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855851 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d49788ac-b5cf-4dfb-9670-2385671fc731-ovnkube-config\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855870 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqr94\" (UniqueName: \"kubernetes.io/projected/d49788ac-b5cf-4dfb-9670-2385671fc731-kube-api-access-xqr94\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855884 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-kubernetes\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.856188 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855898 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9422v\" (UniqueName: \"kubernetes.io/projected/1e21e827-de03-48d5-b7ca-3a5a1c529873-kube-api-access-9422v\") pod \"node-resolver-m6mmt\" (UID: \"1e21e827-de03-48d5-b7ca-3a5a1c529873\") " pod="openshift-dns/node-resolver-m6mmt"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855911 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-system-cni-dir\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855927 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-etc-kubernetes\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855949 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-etc-openvswitch\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.855988 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-node-log\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.856002 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-cni-netd\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.856017 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.856034 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-modprobe-d\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.856055 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-etc-selinux\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.856080 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf4k\" (UniqueName: \"kubernetes.io/projected/6923e9e9-0a10-445f-9824-663ad232ab97-kube-api-access-kcf4k\") pod \"iptables-alerter-b2kgr\" (UID: \"6923e9e9-0a10-445f-9824-663ad232ab97\") " pod="openshift-network-operator/iptables-alerter-b2kgr"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.856106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-run-ovn-kubernetes\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.856124 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-sys\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.856612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.856138 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-tuned\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.857212 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.857196 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 19:07:16.877545 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.877524 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-t4d5n"
Apr 24 19:07:16.885617 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.885601 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t4d5n"
Apr 24 19:07:16.916644 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:16.916608 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8067dec5c6d2a3df6f2c71b42f411e30.slice/crio-14b56f67ae7702762b006df0e40d78112786a54e8d2662258619f62f35d27787 WatchSource:0}: Error finding container 14b56f67ae7702762b006df0e40d78112786a54e8d2662258619f62f35d27787: Status 404 returned error can't find the container with id 14b56f67ae7702762b006df0e40d78112786a54e8d2662258619f62f35d27787
Apr 24 19:07:16.917188 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:16.917166 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5617684810147cab138b7763a579ba59.slice/crio-2d33db6d60ed51ccc1a1208ed945150b99881aef236c636b1b302fa5d25f43a3 WatchSource:0}: Error finding container 2d33db6d60ed51ccc1a1208ed945150b99881aef236c636b1b302fa5d25f43a3: Status 404 returned error can't find the container with id 2d33db6d60ed51ccc1a1208ed945150b99881aef236c636b1b302fa5d25f43a3
Apr 24 19:07:16.924539 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.924236 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:07:16.956537 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956518 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-var-lib-kubelet\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.956614 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956546 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9-agent-certs\") pod \"konnectivity-agent-4f9zh\" (UID: \"e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9\") " pod="kube-system/konnectivity-agent-4f9zh"
Apr 24 19:07:16.956614 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef8e461f-b2c2-42d8-9ae0-451164801b2f-tmp\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.956614 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956575 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-run-netns\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.956614 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956595 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-slash\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956627 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-multus-socket-dir-parent\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956653 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6923e9e9-0a10-445f-9824-663ad232ab97-iptables-alerter-script\") pod \"iptables-alerter-b2kgr\" (UID: \"6923e9e9-0a10-445f-9824-663ad232ab97\") " pod="openshift-network-operator/iptables-alerter-b2kgr"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956658 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-var-lib-kubelet\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956663 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-slash\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956676 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-sysconfig\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956703 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-run-netns\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956711 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-host\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956719 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-sysconfig\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956749 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-host\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.956775 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956765 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-multus-socket-dir-parent\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9w7\" (UniqueName: \"kubernetes.io/projected/ef8e461f-b2c2-42d8-9ae0-451164801b2f-kube-api-access-lq9w7\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xm9k\" (UniqueName: \"kubernetes.io/projected/3f9a2db2-5738-4b14-a835-27706918a96e-kube-api-access-2xm9k\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-registration-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956876 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956900 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-system-cni-dir\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956924 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-os-release\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956945 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6923e9e9-0a10-445f-9824-663ad232ab97-host-slash\") pod \"iptables-alerter-b2kgr\" (UID: \"6923e9e9-0a10-445f-9824-663ad232ab97\") " pod="openshift-network-operator/iptables-alerter-b2kgr"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956973 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-registration-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.956980 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957001 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-device-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpbfj\" (UniqueName: \"kubernetes.io/projected/1cb59573-7573-4947-ac01-0812a566ca34-kube-api-access-hpbfj\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957031 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6923e9e9-0a10-445f-9824-663ad232ab97-host-slash\") pod \"iptables-alerter-b2kgr\" (UID: \"6923e9e9-0a10-445f-9824-663ad232ab97\") " pod="openshift-network-operator/iptables-alerter-b2kgr"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957041 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-os-release\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957081 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-system-cni-dir\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957045 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1186bd80-4999-47f8-b309-3246becab924-cni-binary-copy\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957095 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-device-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.957230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957124 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q24f2\" (UniqueName: \"kubernetes.io/projected/060d8b4b-7fbe-4109-888d-a5c4822cff6e-kube-api-access-q24f2\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957150 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-lib-modules\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1e21e827-de03-48d5-b7ca-3a5a1c529873-hosts-file\") pod \"node-resolver-m6mmt\" (UID: \"1e21e827-de03-48d5-b7ca-3a5a1c529873\") " pod="openshift-dns/node-resolver-m6mmt"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957176 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-os-release\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957224 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-var-lib-cni-multus\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957248 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-var-lib-kubelet\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d49788ac-b5cf-4dfb-9670-2385671fc731-env-overrides\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957281 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6923e9e9-0a10-445f-9824-663ad232ab97-iptables-alerter-script\") pod \"iptables-alerter-b2kgr\" (UID: \"6923e9e9-0a10-445f-9824-663ad232ab97\") " pod="openshift-network-operator/iptables-alerter-b2kgr"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957296 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d49788ac-b5cf-4dfb-9670-2385671fc731-ovnkube-script-lib\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957304 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-os-release\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957309 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-lib-modules\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957323 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-cnibin\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957347 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1e21e827-de03-48d5-b7ca-3a5a1c529873-hosts-file\") pod \"node-resolver-m6mmt\" (UID: \"1e21e827-de03-48d5-b7ca-3a5a1c529873\") " pod="openshift-dns/node-resolver-m6mmt"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957361 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f9a2db2-5738-4b14-a835-27706918a96e-cni-binary-copy\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957390 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-var-lib-kubelet\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957424 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-multus-conf-dir\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957451 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-var-lib-cni-multus\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957391 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-multus-conf-dir\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-socket-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957514 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-cnibin\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957529 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4be73708-29e9-4ed5-856c-a07616631d8e-host\") pod \"node-ca-vdgfs\" (UID: \"4be73708-29e9-4ed5-856c-a07616631d8e\") " pod="openshift-image-registry/node-ca-vdgfs"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4be73708-29e9-4ed5-856c-a07616631d8e-serviceca\") pod \"node-ca-vdgfs\" (UID: \"4be73708-29e9-4ed5-856c-a07616631d8e\") " pod="openshift-image-registry/node-ca-vdgfs"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6hh6\" (UniqueName: \"kubernetes.io/projected/4be73708-29e9-4ed5-856c-a07616631d8e-kube-api-access-b6hh6\") pod \"node-ca-vdgfs\" (UID: \"4be73708-29e9-4ed5-856c-a07616631d8e\") " pod="openshift-image-registry/node-ca-vdgfs"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957587 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1186bd80-4999-47f8-b309-3246becab924-cni-binary-copy\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957599 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1186bd80-4999-47f8-b309-3246becab924-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957618 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-socket-dir\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957640 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-systemd-units\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957668 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-run-k8s-cni-cncf-io\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957734 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-multus-cni-dir\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957759 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-cnibin\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhdl\" (UniqueName: \"kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl\") pod \"network-check-target-b49hn\" (UID: \"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2\") " pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957814 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1186bd80-4999-47f8-b309-3246becab924-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957841 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-run-netns\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957865 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-run-systemd\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.958879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957889 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-kubelet\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957911 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-cni-bin\") pod \"ovnkube-node-tj77d\"
(UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f9a2db2-5738-4b14-a835-27706918a96e-cni-binary-copy\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957937 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d49788ac-b5cf-4dfb-9670-2385671fc731-ovnkube-config\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957983 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqr94\" (UniqueName: \"kubernetes.io/projected/d49788ac-b5cf-4dfb-9670-2385671fc731-kube-api-access-xqr94\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957998 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-run-systemd\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958000 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d49788ac-b5cf-4dfb-9670-2385671fc731-env-overrides\") pod \"ovnkube-node-tj77d\" (UID: 
\"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958009 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-kubernetes\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-run-k8s-cni-cncf-io\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9422v\" (UniqueName: \"kubernetes.io/projected/1e21e827-de03-48d5-b7ca-3a5a1c529873-kube-api-access-9422v\") pod \"node-resolver-m6mmt\" (UID: \"1e21e827-de03-48d5-b7ca-3a5a1c529873\") " pod="openshift-dns/node-resolver-m6mmt" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958053 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4be73708-29e9-4ed5-856c-a07616631d8e-host\") pod \"node-ca-vdgfs\" (UID: \"4be73708-29e9-4ed5-856c-a07616631d8e\") " pod="openshift-image-registry/node-ca-vdgfs" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958063 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-system-cni-dir\") pod \"multus-lx4cb\" (UID: 
\"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958067 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-systemd-units\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958099 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-etc-kubernetes\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958110 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-kubelet\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958128 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-etc-openvswitch\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958149 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-cni-bin\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958155 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-node-log\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.959665 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958181 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-cni-netd\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958208 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958235 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-modprobe-d\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958252 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-multus-cni-dir\") pod \"multus-lx4cb\" (UID: 
\"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958260 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-etc-selinux\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958301 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf4k\" (UniqueName: \"kubernetes.io/projected/6923e9e9-0a10-445f-9824-663ad232ab97-kube-api-access-kcf4k\") pod \"iptables-alerter-b2kgr\" (UID: \"6923e9e9-0a10-445f-9824-663ad232ab97\") " pod="openshift-network-operator/iptables-alerter-b2kgr" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958330 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-run-ovn-kubernetes\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958344 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-etc-selinux\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958381 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-sys\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958409 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1186bd80-4999-47f8-b309-3246becab924-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-cni-netd\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958420 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-tuned\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-var-lib-cni-bin\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958508 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-system-cni-dir\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-modprobe-d\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958544 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-node-log\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958581 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d49788ac-b5cf-4dfb-9670-2385671fc731-ovnkube-config\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960388 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958587 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-kubernetes\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958594 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958619 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-run-netns\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958630 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-sys-fs\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pv8p\" (UniqueName: \"kubernetes.io/projected/1186bd80-4999-47f8-b309-3246becab924-kube-api-access-5pv8p\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958714 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-log-socket\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d49788ac-b5cf-4dfb-9670-2385671fc731-ovn-node-metrics-cert\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958766 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-sysctl-d\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958794 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-sysctl-conf\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958809 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1cb59573-7573-4947-ac01-0812a566ca34-sys-fs\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 
19:07:16.958819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1e21e827-de03-48d5-b7ca-3a5a1c529873-tmp-dir\") pod \"node-resolver-m6mmt\" (UID: \"1e21e827-de03-48d5-b7ca-3a5a1c529873\") " pod="openshift-dns/node-resolver-m6mmt" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1186bd80-4999-47f8-b309-3246becab924-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958660 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-etc-openvswitch\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958849 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-run-ovn\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958896 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-run-ovn\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 
19:07:16.958903 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-host-run-ovn-kubernetes\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.960871 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958850 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-log-socket\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958659 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-var-lib-cni-bin\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-systemd\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.958974 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-run\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959023 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-sysctl-d\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959029 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-sysctl-conf\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959053 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-systemd\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959079 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-cnibin\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959089 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-hostroot\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959118 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959130 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-sys\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-var-lib-openvswitch\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959168 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-hostroot\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959172 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f9a2db2-5738-4b14-a835-27706918a96e-multus-daemon-config\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-run-multus-certs\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959225 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-run-openvswitch\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef8e461f-b2c2-42d8-9ae0-451164801b2f-run\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959246 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1e21e827-de03-48d5-b7ca-3a5a1c529873-tmp-dir\") pod \"node-resolver-m6mmt\" (UID: \"1e21e827-de03-48d5-b7ca-3a5a1c529873\") " pod="openshift-dns/node-resolver-m6mmt" Apr 24 19:07:16.961362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959254 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-var-lib-openvswitch\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959282 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-host-run-multus-certs\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959201 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f9a2db2-5738-4b14-a835-27706918a96e-etc-kubernetes\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959249 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9-konnectivity-ca\") pod \"konnectivity-agent-4f9zh\" (UID: \"e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9\") " pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959375 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1186bd80-4999-47f8-b309-3246becab924-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d49788ac-b5cf-4dfb-9670-2385671fc731-run-openvswitch\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:16.959450 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:16.959522 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs podName:060d8b4b-7fbe-4109-888d-a5c4822cff6e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:17.459492755 +0000 UTC m=+2.098612594 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs") pod "network-metrics-daemon-4lz47" (UID: "060d8b4b-7fbe-4109-888d-a5c4822cff6e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959710 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4be73708-29e9-4ed5-856c-a07616631d8e-serviceca\") pod \"node-ca-vdgfs\" (UID: \"4be73708-29e9-4ed5-856c-a07616631d8e\") " pod="openshift-image-registry/node-ca-vdgfs" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.957970 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d49788ac-b5cf-4dfb-9670-2385671fc731-ovnkube-script-lib\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.959919 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9-konnectivity-ca\") pod \"konnectivity-agent-4f9zh\" (UID: \"e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9\") " pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.960301 
2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef8e461f-b2c2-42d8-9ae0-451164801b2f-tmp\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.960340 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f9a2db2-5738-4b14-a835-27706918a96e-multus-daemon-config\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.960574 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9-agent-certs\") pod \"konnectivity-agent-4f9zh\" (UID: \"e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9\") " pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.960897 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d49788ac-b5cf-4dfb-9670-2385671fc731-ovn-node-metrics-cert\") pod \"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.961810 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.961068 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef8e461f-b2c2-42d8-9ae0-451164801b2f-etc-tuned\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.963248 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:16.963230 2568 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:07:16.963349 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:16.963255 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:07:16.963349 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:16.963268 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hnhdl for pod openshift-network-diagnostics/network-check-target-b49hn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:16.963453 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:16.963349 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl podName:7c6541c7-0cb3-447d-baaa-7d58f2cba8e2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:17.46331418 +0000 UTC m=+2.102434014 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hnhdl" (UniqueName: "kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl") pod "network-check-target-b49hn" (UID: "7c6541c7-0cb3-447d-baaa-7d58f2cba8e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:16.964667 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.964637 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpbfj\" (UniqueName: \"kubernetes.io/projected/1cb59573-7573-4947-ac01-0812a566ca34-kube-api-access-hpbfj\") pod \"aws-ebs-csi-driver-node-xqsdx\" (UID: \"1cb59573-7573-4947-ac01-0812a566ca34\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:16.964747 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.964665 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xm9k\" (UniqueName: \"kubernetes.io/projected/3f9a2db2-5738-4b14-a835-27706918a96e-kube-api-access-2xm9k\") pod \"multus-lx4cb\" (UID: \"3f9a2db2-5738-4b14-a835-27706918a96e\") " pod="openshift-multus/multus-lx4cb" Apr 24 19:07:16.964797 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.964789 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q24f2\" (UniqueName: \"kubernetes.io/projected/060d8b4b-7fbe-4109-888d-a5c4822cff6e-kube-api-access-q24f2\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:16.965069 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.965049 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9w7\" (UniqueName: \"kubernetes.io/projected/ef8e461f-b2c2-42d8-9ae0-451164801b2f-kube-api-access-lq9w7\") pod \"tuned-lppw8\" (UID: \"ef8e461f-b2c2-42d8-9ae0-451164801b2f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:16.965662 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.965644 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6hh6\" (UniqueName: \"kubernetes.io/projected/4be73708-29e9-4ed5-856c-a07616631d8e-kube-api-access-b6hh6\") pod \"node-ca-vdgfs\" (UID: \"4be73708-29e9-4ed5-856c-a07616631d8e\") " pod="openshift-image-registry/node-ca-vdgfs" Apr 24 19:07:16.968274 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.968256 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf4k\" (UniqueName: \"kubernetes.io/projected/6923e9e9-0a10-445f-9824-663ad232ab97-kube-api-access-kcf4k\") pod \"iptables-alerter-b2kgr\" (UID: \"6923e9e9-0a10-445f-9824-663ad232ab97\") " pod="openshift-network-operator/iptables-alerter-b2kgr" Apr 24 19:07:16.971506 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.971482 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pv8p\" (UniqueName: \"kubernetes.io/projected/1186bd80-4999-47f8-b309-3246becab924-kube-api-access-5pv8p\") pod \"multus-additional-cni-plugins-qxnnz\" (UID: \"1186bd80-4999-47f8-b309-3246becab924\") " pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:16.972008 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.971992 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9422v\" (UniqueName: \"kubernetes.io/projected/1e21e827-de03-48d5-b7ca-3a5a1c529873-kube-api-access-9422v\") pod \"node-resolver-m6mmt\" (UID: \"1e21e827-de03-48d5-b7ca-3a5a1c529873\") " pod="openshift-dns/node-resolver-m6mmt" Apr 24 19:07:16.972082 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.972067 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqr94\" (UniqueName: \"kubernetes.io/projected/d49788ac-b5cf-4dfb-9670-2385671fc731-kube-api-access-xqr94\") pod 
\"ovnkube-node-tj77d\" (UID: \"d49788ac-b5cf-4dfb-9670-2385671fc731\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:16.980864 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.980828 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal" event={"ID":"5617684810147cab138b7763a579ba59","Type":"ContainerStarted","Data":"2d33db6d60ed51ccc1a1208ed945150b99881aef236c636b1b302fa5d25f43a3"} Apr 24 19:07:16.981700 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:16.981683 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" event={"ID":"8067dec5c6d2a3df6f2c71b42f411e30","Type":"ContainerStarted","Data":"14b56f67ae7702762b006df0e40d78112786a54e8d2662258619f62f35d27787"} Apr 24 19:07:17.151444 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.151367 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:17.157627 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:17.157601 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3dcdece_1b7a_4952_8ab6_a6e6dc7089e9.slice/crio-24660d773507fdbfa074c32baa335b93e1bbee6e55767d7df456d3ebed55384d WatchSource:0}: Error finding container 24660d773507fdbfa074c32baa335b93e1bbee6e55767d7df456d3ebed55384d: Status 404 returned error can't find the container with id 24660d773507fdbfa074c32baa335b93e1bbee6e55767d7df456d3ebed55384d Apr 24 19:07:17.166422 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.166402 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lppw8" Apr 24 19:07:17.172616 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:17.172593 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef8e461f_b2c2_42d8_9ae0_451164801b2f.slice/crio-48d29060a161a1f2e69ac6550c3c1a86f2d34607292ab451de512366aa0eceb9 WatchSource:0}: Error finding container 48d29060a161a1f2e69ac6550c3c1a86f2d34607292ab451de512366aa0eceb9: Status 404 returned error can't find the container with id 48d29060a161a1f2e69ac6550c3c1a86f2d34607292ab451de512366aa0eceb9 Apr 24 19:07:17.181097 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.181078 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m6mmt" Apr 24 19:07:17.186330 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:17.186312 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e21e827_de03_48d5_b7ca_3a5a1c529873.slice/crio-fd9a9c9b0aba3c24d29a72355c6db74713ce97518e603686eaebcd7d56266334 WatchSource:0}: Error finding container fd9a9c9b0aba3c24d29a72355c6db74713ce97518e603686eaebcd7d56266334: Status 404 returned error can't find the container with id fd9a9c9b0aba3c24d29a72355c6db74713ce97518e603686eaebcd7d56266334 Apr 24 19:07:17.194314 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.194299 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vdgfs" Apr 24 19:07:17.199366 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:17.199344 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4be73708_29e9_4ed5_856c_a07616631d8e.slice/crio-889f8fcd613419f9f23baa984beec798ee38045b1198dd2457c7b14000c985b8 WatchSource:0}: Error finding container 889f8fcd613419f9f23baa984beec798ee38045b1198dd2457c7b14000c985b8: Status 404 returned error can't find the container with id 889f8fcd613419f9f23baa984beec798ee38045b1198dd2457c7b14000c985b8 Apr 24 19:07:17.213418 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.213398 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lx4cb" Apr 24 19:07:17.218285 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:17.218266 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9a2db2_5738_4b14_a835_27706918a96e.slice/crio-7f8c59783819d6b85f86b35ec602b53ebf4afbde98e7ed9d99cf91708ef85d76 WatchSource:0}: Error finding container 7f8c59783819d6b85f86b35ec602b53ebf4afbde98e7ed9d99cf91708ef85d76: Status 404 returned error can't find the container with id 7f8c59783819d6b85f86b35ec602b53ebf4afbde98e7ed9d99cf91708ef85d76 Apr 24 19:07:17.222002 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.221986 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" Apr 24 19:07:17.228841 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:17.228821 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb59573_7573_4947_ac01_0812a566ca34.slice/crio-06501cc33f89a55ba037bd5ad5a7d59e4f323c230ccdc31afe4c704c0fefaca2 WatchSource:0}: Error finding container 06501cc33f89a55ba037bd5ad5a7d59e4f323c230ccdc31afe4c704c0fefaca2: Status 404 returned error can't find the container with id 06501cc33f89a55ba037bd5ad5a7d59e4f323c230ccdc31afe4c704c0fefaca2 Apr 24 19:07:17.228933 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.228896 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" Apr 24 19:07:17.234615 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.234597 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b2kgr" Apr 24 19:07:17.234828 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:17.234655 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1186bd80_4999_47f8_b309_3246becab924.slice/crio-e8af489bec9ccb75316ae11c7491843325b831c9833931701adfba0731ec854e WatchSource:0}: Error finding container e8af489bec9ccb75316ae11c7491843325b831c9833931701adfba0731ec854e: Status 404 returned error can't find the container with id e8af489bec9ccb75316ae11c7491843325b831c9833931701adfba0731ec854e Apr 24 19:07:17.240380 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.240349 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:17.240823 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:17.240739 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6923e9e9_0a10_445f_9824_663ad232ab97.slice/crio-0b415e89153d055050ad6d2f8f18e35a66ef785d43054d7a71bf68efe94fab98 WatchSource:0}: Error finding container 0b415e89153d055050ad6d2f8f18e35a66ef785d43054d7a71bf68efe94fab98: Status 404 returned error can't find the container with id 0b415e89153d055050ad6d2f8f18e35a66ef785d43054d7a71bf68efe94fab98 Apr 24 19:07:17.247343 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:17.247323 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd49788ac_b5cf_4dfb_9670_2385671fc731.slice/crio-d4104b13ce486313c51c3c1688cfbaf801566803fc27e215e79590a83fd90297 WatchSource:0}: Error finding container d4104b13ce486313c51c3c1688cfbaf801566803fc27e215e79590a83fd90297: Status 404 returned error can't find the container with id d4104b13ce486313c51c3c1688cfbaf801566803fc27e215e79590a83fd90297 Apr 24 19:07:17.463391 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.463311 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:17.463536 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:17.463464 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:17.463536 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:17.463524 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs podName:060d8b4b-7fbe-4109-888d-a5c4822cff6e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.46350461 +0000 UTC m=+3.102624437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs") pod "network-metrics-daemon-4lz47" (UID: "060d8b4b-7fbe-4109-888d-a5c4822cff6e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:17.564005 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.563968 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhdl\" (UniqueName: \"kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl\") pod \"network-check-target-b49hn\" (UID: \"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2\") " pod="openshift-network-diagnostics/network-check-target-b49hn" Apr 24 19:07:17.564227 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:17.564162 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:07:17.564227 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:17.564182 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:07:17.564227 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:17.564196 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hnhdl for pod openshift-network-diagnostics/network-check-target-b49hn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:17.564386 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:17.564251 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl podName:7c6541c7-0cb3-447d-baaa-7d58f2cba8e2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.564233436 +0000 UTC m=+3.203353275 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hnhdl" (UniqueName: "kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl") pod "network-check-target-b49hn" (UID: "7c6541c7-0cb3-447d-baaa-7d58f2cba8e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:17.692124 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.692092 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:07:17.817303 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.817222 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:07:17.886603 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.886559 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:02:16 +0000 UTC" deadline="2027-10-26 04:37:52.818322592 +0000 UTC" Apr 24 19:07:17.886603 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.886593 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13185h30m34.931733875s" Apr 24 19:07:17.988344 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:17.988248 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:17.988526 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:17.988363 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e" Apr 24 19:07:18.002072 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.002039 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" event={"ID":"1cb59573-7573-4947-ac01-0812a566ca34","Type":"ContainerStarted","Data":"06501cc33f89a55ba037bd5ad5a7d59e4f323c230ccdc31afe4c704c0fefaca2"} Apr 24 19:07:18.019002 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.018913 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vdgfs" event={"ID":"4be73708-29e9-4ed5-856c-a07616631d8e","Type":"ContainerStarted","Data":"889f8fcd613419f9f23baa984beec798ee38045b1198dd2457c7b14000c985b8"} Apr 24 19:07:18.030250 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.030143 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerStarted","Data":"d4104b13ce486313c51c3c1688cfbaf801566803fc27e215e79590a83fd90297"} Apr 24 19:07:18.045126 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.045038 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b2kgr" event={"ID":"6923e9e9-0a10-445f-9824-663ad232ab97","Type":"ContainerStarted","Data":"0b415e89153d055050ad6d2f8f18e35a66ef785d43054d7a71bf68efe94fab98"} Apr 24 19:07:18.054650 ip-10-0-129-23 kubenswrapper[2568]: I0424 
19:07:18.054571 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" event={"ID":"1186bd80-4999-47f8-b309-3246becab924","Type":"ContainerStarted","Data":"e8af489bec9ccb75316ae11c7491843325b831c9833931701adfba0731ec854e"} Apr 24 19:07:18.060915 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.060886 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lx4cb" event={"ID":"3f9a2db2-5738-4b14-a835-27706918a96e","Type":"ContainerStarted","Data":"7f8c59783819d6b85f86b35ec602b53ebf4afbde98e7ed9d99cf91708ef85d76"} Apr 24 19:07:18.070357 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.070296 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m6mmt" event={"ID":"1e21e827-de03-48d5-b7ca-3a5a1c529873","Type":"ContainerStarted","Data":"fd9a9c9b0aba3c24d29a72355c6db74713ce97518e603686eaebcd7d56266334"} Apr 24 19:07:18.095366 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.095337 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lppw8" event={"ID":"ef8e461f-b2c2-42d8-9ae0-451164801b2f","Type":"ContainerStarted","Data":"48d29060a161a1f2e69ac6550c3c1a86f2d34607292ab451de512366aa0eceb9"} Apr 24 19:07:18.107476 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.107450 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4f9zh" event={"ID":"e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9","Type":"ContainerStarted","Data":"24660d773507fdbfa074c32baa335b93e1bbee6e55767d7df456d3ebed55384d"} Apr 24 19:07:18.242804 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.242590 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:07:18.472280 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.472190 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:18.472432 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:18.472350 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:18.472432 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:18.472412 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs podName:060d8b4b-7fbe-4109-888d-a5c4822cff6e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:20.472392602 +0000 UTC m=+5.111512423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs") pod "network-metrics-daemon-4lz47" (UID: "060d8b4b-7fbe-4109-888d-a5c4822cff6e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:18.572575 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.572523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhdl\" (UniqueName: \"kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl\") pod \"network-check-target-b49hn\" (UID: \"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2\") " pod="openshift-network-diagnostics/network-check-target-b49hn" Apr 24 19:07:18.572743 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:18.572684 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:07:18.572743 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:18.572702 2568 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:07:18.572743 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:18.572713 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hnhdl for pod openshift-network-diagnostics/network-check-target-b49hn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:18.572900 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:18.572770 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl podName:7c6541c7-0cb3-447d-baaa-7d58f2cba8e2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:20.572751519 +0000 UTC m=+5.211871339 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hnhdl" (UniqueName: "kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl") pod "network-check-target-b49hn" (UID: "7c6541c7-0cb3-447d-baaa-7d58f2cba8e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:18.887594 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.887506 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:02:16 +0000 UTC" deadline="2028-01-01 04:05:23.204470364 +0000 UTC" Apr 24 19:07:18.887594 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.887544 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14792h58m4.316929599s" Apr 24 19:07:18.979699 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:18.979669 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn" Apr 24 19:07:18.979865 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:18.979794 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2" Apr 24 19:07:19.984982 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:19.984926 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:19.985459 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:19.985069 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:20.487835 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:20.487189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:20.487835 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:20.487407 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:20.487835 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:20.487470 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs podName:060d8b4b-7fbe-4109-888d-a5c4822cff6e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:24.487451669 +0000 UTC m=+9.126571494 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs") pod "network-metrics-daemon-4lz47" (UID: "060d8b4b-7fbe-4109-888d-a5c4822cff6e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:20.588581 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:20.588543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhdl\" (UniqueName: \"kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl\") pod \"network-check-target-b49hn\" (UID: \"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2\") " pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:20.588769 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:20.588680 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:07:20.588769 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:20.588705 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:07:20.588769 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:20.588718 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hnhdl for pod openshift-network-diagnostics/network-check-target-b49hn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:20.588932 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:20.588781 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl podName:7c6541c7-0cb3-447d-baaa-7d58f2cba8e2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:24.588762551 +0000 UTC m=+9.227882373 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hnhdl" (UniqueName: "kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl") pod "network-check-target-b49hn" (UID: "7c6541c7-0cb3-447d-baaa-7d58f2cba8e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:20.979408 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:20.978834 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:20.979408 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:20.978977 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2"
Apr 24 19:07:21.983011 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:21.982919 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:21.983474 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:21.983063 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:22.978732 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:22.978663 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:22.978925 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:22.978815 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2"
Apr 24 19:07:23.979315 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:23.979283 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:23.979775 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:23.979424 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:24.521034 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:24.520472 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:24.521034 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:24.520622 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:24.521034 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:24.520683 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs podName:060d8b4b-7fbe-4109-888d-a5c4822cff6e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:32.520662229 +0000 UTC m=+17.159782054 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs") pod "network-metrics-daemon-4lz47" (UID: "060d8b4b-7fbe-4109-888d-a5c4822cff6e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:24.621090 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:24.620985 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhdl\" (UniqueName: \"kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl\") pod \"network-check-target-b49hn\" (UID: \"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2\") " pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:24.621270 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:24.621174 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:07:24.621270 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:24.621193 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:07:24.621270 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:24.621206 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hnhdl for pod openshift-network-diagnostics/network-check-target-b49hn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:24.621270 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:24.621268 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl podName:7c6541c7-0cb3-447d-baaa-7d58f2cba8e2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:32.621250157 +0000 UTC m=+17.260369978 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hnhdl" (UniqueName: "kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl") pod "network-check-target-b49hn" (UID: "7c6541c7-0cb3-447d-baaa-7d58f2cba8e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:24.979534 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:24.979458 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:24.979948 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:24.979589 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2"
Apr 24 19:07:25.980404 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:25.980371 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:25.980815 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:25.980469 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:26.979123 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:26.979091 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:26.979274 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:26.979210 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2"
Apr 24 19:07:27.979013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:27.978978 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:27.979419 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:27.979082 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:28.979172 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:28.979134 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:28.979544 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:28.979248 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2"
Apr 24 19:07:29.978689 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:29.978658 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:29.978856 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:29.978786 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:30.978916 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:30.978885 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:30.979285 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:30.979013 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2"
Apr 24 19:07:31.979682 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:31.979645 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:31.980088 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:31.979782 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:32.580805 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:32.580774 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:32.581035 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:32.580899 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:32.581035 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:32.580977 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs podName:060d8b4b-7fbe-4109-888d-a5c4822cff6e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:48.580944903 +0000 UTC m=+33.220064723 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs") pod "network-metrics-daemon-4lz47" (UID: "060d8b4b-7fbe-4109-888d-a5c4822cff6e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:32.681887 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:32.681855 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhdl\" (UniqueName: \"kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl\") pod \"network-check-target-b49hn\" (UID: \"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2\") " pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:32.682059 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:32.682009 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:07:32.682059 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:32.682025 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:07:32.682059 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:32.682034 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hnhdl for pod openshift-network-diagnostics/network-check-target-b49hn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:32.682190 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:32.682093 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl podName:7c6541c7-0cb3-447d-baaa-7d58f2cba8e2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:48.682076198 +0000 UTC m=+33.321196029 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hnhdl" (UniqueName: "kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl") pod "network-check-target-b49hn" (UID: "7c6541c7-0cb3-447d-baaa-7d58f2cba8e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:32.979693 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:32.979624 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:32.979933 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:32.979736 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2"
Apr 24 19:07:33.979745 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:33.979713 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:33.979922 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:33.979851 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:34.979115 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:34.979089 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:34.979525 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:34.979206 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2"
Apr 24 19:07:35.146389 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.146353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lx4cb" event={"ID":"3f9a2db2-5738-4b14-a835-27706918a96e","Type":"ContainerStarted","Data":"c788776b94793483c0378d38657809543ac2ff3a4b64fb9539d6f7bddeb8b3c3"}
Apr 24 19:07:35.149635 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.149609 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lppw8" event={"ID":"ef8e461f-b2c2-42d8-9ae0-451164801b2f","Type":"ContainerStarted","Data":"45a89a5980132b17a613595cdc16f130cdbd1179930cfff59490fdaf48f06e70"}
Apr 24 19:07:35.151344 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.151143 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal" event={"ID":"5617684810147cab138b7763a579ba59","Type":"ContainerStarted","Data":"09f4e38e7328d279479fd1993c9da558d92707945b47cd5ff4832e5a48ea5afd"}
Apr 24 19:07:35.155836 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.155671 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log"
Apr 24 19:07:35.161351 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.161326 2568 generic.go:358] "Generic (PLEG): container finished" podID="d49788ac-b5cf-4dfb-9670-2385671fc731" containerID="e5d2a1c010f91e02f26705805a6b1d6bec8501367f061da9eede66aa418f9a41" exitCode=1
Apr 24 19:07:35.161450 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.161365 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerStarted","Data":"e03a4d56555d6a9ede0b45f77f549d0c9cdbe0ea5359fb92e75cef791850f530"}
Apr 24 19:07:35.161450 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.161405 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerStarted","Data":"9053f6217a2da563e4ea5a297c57e82fbad98c702c603cbec6e3c775cfae5b9f"}
Apr 24 19:07:35.161450 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.161421 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerStarted","Data":"03d45f688b0dd944fb120e3c93d1228fd8744012587865fdf24fdfd3339f154e"}
Apr 24 19:07:35.161450 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.161434 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerDied","Data":"e5d2a1c010f91e02f26705805a6b1d6bec8501367f061da9eede66aa418f9a41"}
Apr 24 19:07:35.161450 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.161450 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerStarted","Data":"386bcc5715b30de316b089ee4fbed9a0dce355b4b574c1fa0edf6a7869857a38"}
Apr 24 19:07:35.163542 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.163496 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lx4cb" podStartSLOduration=1.727401875 podStartE2EDuration="19.163485885s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:17.219544165 +0000 UTC m=+1.858663984" lastFinishedPulling="2026-04-24 19:07:34.655628159 +0000 UTC m=+19.294747994" observedRunningTime="2026-04-24 19:07:35.16252193 +0000 UTC m=+19.801641796" watchObservedRunningTime="2026-04-24 19:07:35.163485885 +0000 UTC m=+19.802605726"
Apr 24 19:07:35.174702 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.174657 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-23.ec2.internal" podStartSLOduration=19.174644107 podStartE2EDuration="19.174644107s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:35.174512602 +0000 UTC m=+19.813632444" watchObservedRunningTime="2026-04-24 19:07:35.174644107 +0000 UTC m=+19.813763949"
Apr 24 19:07:35.190091 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.190005 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lppw8" podStartSLOduration=1.747933521 podStartE2EDuration="19.189989248s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:17.173853038 +0000 UTC m=+1.812972857" lastFinishedPulling="2026-04-24 19:07:34.61590875 +0000 UTC m=+19.255028584" observedRunningTime="2026-04-24 19:07:35.189681865 +0000 UTC m=+19.828801705" watchObservedRunningTime="2026-04-24 19:07:35.189989248 +0000 UTC m=+19.829109092"
Apr 24 19:07:35.979564 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:35.979531 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:35.980404 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:35.979655 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:36.164948 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.164912 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" event={"ID":"1cb59573-7573-4947-ac01-0812a566ca34","Type":"ContainerStarted","Data":"67a9ab1841b89718b5d77f4c8ba5ade86ce8c096b1ccd420cff558fc47831a3e"}
Apr 24 19:07:36.166280 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.166235 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vdgfs" event={"ID":"4be73708-29e9-4ed5-856c-a07616631d8e","Type":"ContainerStarted","Data":"756f93e69fe4913bd890bc910679317d5376ab60594f26b916dcee93912089aa"}
Apr 24 19:07:36.168922 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.168900 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log"
Apr 24 19:07:36.169269 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.169244 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerStarted","Data":"bf3a82edc5067e80090e966eb7389221f2ece390392c891ecb5bf7c7b2e809fc"}
Apr 24 19:07:36.170660 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.170632 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b2kgr" event={"ID":"6923e9e9-0a10-445f-9824-663ad232ab97","Type":"ContainerStarted","Data":"db49922061c8098e8425a34733a5fb084db4a58e54d4c300b424dfe258285eb8"}
Apr 24 19:07:36.172097 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.172071 2568 generic.go:358] "Generic (PLEG): container finished" podID="1186bd80-4999-47f8-b309-3246becab924" containerID="f4a5f4009c99af21df242fd9d8cbf69fa8565e2ac32b86f60bf355c1f6f52a51" exitCode=0
Apr 24 19:07:36.172200 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.172145 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" event={"ID":"1186bd80-4999-47f8-b309-3246becab924","Type":"ContainerDied","Data":"f4a5f4009c99af21df242fd9d8cbf69fa8565e2ac32b86f60bf355c1f6f52a51"}
Apr 24 19:07:36.173592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.173570 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m6mmt" event={"ID":"1e21e827-de03-48d5-b7ca-3a5a1c529873","Type":"ContainerStarted","Data":"e32a48e8b8c5adf000d63df393ef5952c6662ecfc65df702fc03f909118ecdac"}
Apr 24 19:07:36.175147 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.175123 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4f9zh" event={"ID":"e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9","Type":"ContainerStarted","Data":"d10a51302181b790b4eda7b133f43582f08bdb02c768c75ac8e481794a24eaea"}
Apr 24 19:07:36.176635 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.176610 2568 generic.go:358] "Generic (PLEG): container finished" podID="8067dec5c6d2a3df6f2c71b42f411e30" containerID="db150282844cd38a8cf793f413b14c6f0a716862799ade8ca4dae6d7cff76e4f" exitCode=0
Apr 24 19:07:36.176732 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.176694 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" event={"ID":"8067dec5c6d2a3df6f2c71b42f411e30","Type":"ContainerDied","Data":"db150282844cd38a8cf793f413b14c6f0a716862799ade8ca4dae6d7cff76e4f"}
Apr 24 19:07:36.193149 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.193104 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vdgfs" podStartSLOduration=2.8052808799999998 podStartE2EDuration="20.19308728s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:17.200645278 +0000 UTC m=+1.839765101" lastFinishedPulling="2026-04-24 19:07:34.58845167 +0000 UTC m=+19.227571501" observedRunningTime="2026-04-24 19:07:36.180049446 +0000 UTC m=+20.819169288" watchObservedRunningTime="2026-04-24 19:07:36.19308728 +0000 UTC m=+20.832207122"
Apr 24 19:07:36.210408 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.210366 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m6mmt" podStartSLOduration=2.808932891 podStartE2EDuration="20.210354372s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:17.187737942 +0000 UTC m=+1.826857761" lastFinishedPulling="2026-04-24 19:07:34.589159417 +0000 UTC m=+19.228279242" observedRunningTime="2026-04-24 19:07:36.210315025 +0000 UTC m=+20.849434866" watchObservedRunningTime="2026-04-24 19:07:36.210354372 +0000 UTC m=+20.849474213"
Apr 24 19:07:36.224555 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.224508 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4f9zh" podStartSLOduration=2.794937534 podStartE2EDuration="20.224493871s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:17.159130728 +0000 UTC m=+1.798250548" lastFinishedPulling="2026-04-24 19:07:34.588687053 +0000 UTC m=+19.227806885" observedRunningTime="2026-04-24 19:07:36.223909943 +0000 UTC m=+20.863029785" watchObservedRunningTime="2026-04-24 19:07:36.224493871 +0000 UTC m=+20.863613724"
Apr 24 19:07:36.373613 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.373553 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 19:07:36.919534 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.919431 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T19:07:36.373575565Z","UUID":"c2c3839e-fba4-4dd2-b33c-4f97f145a623","Handler":null,"Name":"","Endpoint":""}
Apr 24 19:07:36.922833 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.922810 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 19:07:36.922973 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.922841 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 19:07:36.978731 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:36.978706 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:36.978862 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:36.978809 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2"
Apr 24 19:07:37.181103 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:37.181071 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" event={"ID":"8067dec5c6d2a3df6f2c71b42f411e30","Type":"ContainerStarted","Data":"c3522adb4b86a7e2a615e940b6e1e691dad4bc08e5278a131ba47f9d857d1ce8"}
Apr 24 19:07:37.182933 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:37.182843 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" event={"ID":"1cb59573-7573-4947-ac01-0812a566ca34","Type":"ContainerStarted","Data":"20c2548f92282d0a66d78e0a4acb5343181b3914c94c316a05579e9fa2cc36f4"}
Apr 24 19:07:37.196226 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:37.196186 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-b2kgr" podStartSLOduration=3.823367003 podStartE2EDuration="21.196132837s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:17.243580118 +0000 UTC m=+1.882699938" lastFinishedPulling="2026-04-24 19:07:34.616345947 +0000 UTC m=+19.255465772" observedRunningTime="2026-04-24 19:07:36.258415878 +0000 UTC m=+20.897535719" watchObservedRunningTime="2026-04-24 19:07:37.196132837 +0000 UTC m=+21.835252679"
Apr 24 19:07:37.196705 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:37.196652 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-23.ec2.internal" podStartSLOduration=21.196639711 podStartE2EDuration="21.196639711s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:37.19582574 +0000 UTC m=+21.834945583" watchObservedRunningTime="2026-04-24 19:07:37.196639711 +0000 UTC m=+21.835759553"
Apr 24 19:07:37.979634 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:37.979605 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:37.979788 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:37.979702 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:07:38.187515 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:38.187489 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log"
Apr 24 19:07:38.188035 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:38.187864 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerStarted","Data":"18f6e1bba0839cbe56b50baed2c299a17a9c39c65f488feba19487e3dc4ff449"}
Apr 24 19:07:38.189908 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:38.189879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" event={"ID":"1cb59573-7573-4947-ac01-0812a566ca34","Type":"ContainerStarted","Data":"c3d94577088027f71e31a0970877579a9290e3c7966c4f7b332ed311d160f3c8"}
Apr 24 19:07:38.209242 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:38.209199 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xqsdx" podStartSLOduration=2.339850226 podStartE2EDuration="22.209187901s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:17.23014225 +0000 UTC m=+1.869262069" lastFinishedPulling="2026-04-24 19:07:37.099479906 +0000 UTC m=+21.738599744" observedRunningTime="2026-04-24 19:07:38.209088655 +0000 UTC m=+22.848208496" watchObservedRunningTime="2026-04-24 19:07:38.209187901 +0000 UTC m=+22.848307720"
Apr 24 19:07:38.979028 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:38.978998 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn" Apr 24 19:07:38.979179 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:38.979094 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2" Apr 24 19:07:39.375026 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:39.374994 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:39.375740 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:39.375721 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:39.979025 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:39.978995 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:39.979192 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:39.979134 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e" Apr 24 19:07:40.193239 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:40.193196 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:40.193693 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:40.193674 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4f9zh" Apr 24 19:07:40.979675 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:40.979521 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn" Apr 24 19:07:40.980114 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:40.979741 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2" Apr 24 19:07:41.198052 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:41.198025 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:07:41.198358 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:41.198336 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerStarted","Data":"7fcdd3ceab866603cf7fce4b2267a2afb4a76b2dace5f0eee62b14460c4ae901"} Apr 24 19:07:41.198652 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:41.198632 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:41.198818 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:41.198803 2568 scope.go:117] "RemoveContainer" containerID="e5d2a1c010f91e02f26705805a6b1d6bec8501367f061da9eede66aa418f9a41" Apr 24 19:07:41.200135 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:41.200104 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" event={"ID":"1186bd80-4999-47f8-b309-3246becab924","Type":"ContainerStarted","Data":"e9b1d8cb215739e30e8c8bcb4e8306617bcfb71cfc130ff14d14ddee83f34358"} Apr 24 19:07:41.213134 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:41.213115 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:41.979506 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:41.979478 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:41.979661 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:41.979574 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e" Apr 24 19:07:42.205159 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.205132 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:07:42.205706 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.205489 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" event={"ID":"d49788ac-b5cf-4dfb-9670-2385671fc731","Type":"ContainerStarted","Data":"8f3bd90bf3c035cbc843c134b8dcfac5b336939886b7ce6071d82da46976ad2d"} Apr 24 19:07:42.205706 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.205584 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:07:42.205865 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.205846 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:42.207062 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.207039 2568 generic.go:358] "Generic (PLEG): container finished" podID="1186bd80-4999-47f8-b309-3246becab924" containerID="e9b1d8cb215739e30e8c8bcb4e8306617bcfb71cfc130ff14d14ddee83f34358" exitCode=0 Apr 24 19:07:42.207186 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.207114 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" 
event={"ID":"1186bd80-4999-47f8-b309-3246becab924","Type":"ContainerDied","Data":"e9b1d8cb215739e30e8c8bcb4e8306617bcfb71cfc130ff14d14ddee83f34358"} Apr 24 19:07:42.220010 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.219992 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:42.237475 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.237368 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" podStartSLOduration=8.851107108 podStartE2EDuration="26.237355676s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:17.248655561 +0000 UTC m=+1.887775381" lastFinishedPulling="2026-04-24 19:07:34.63490413 +0000 UTC m=+19.274023949" observedRunningTime="2026-04-24 19:07:42.235750937 +0000 UTC m=+26.874870777" watchObservedRunningTime="2026-04-24 19:07:42.237355676 +0000 UTC m=+26.876475517" Apr 24 19:07:42.477077 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.476916 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d" Apr 24 19:07:42.635220 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.635182 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b49hn"] Apr 24 19:07:42.635370 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.635306 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn" Apr 24 19:07:42.635427 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:42.635410 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2" Apr 24 19:07:42.635898 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.635868 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4lz47"] Apr 24 19:07:42.636021 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:42.635981 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:42.636088 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:42.636070 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e" Apr 24 19:07:43.979326 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:43.979265 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:43.979326 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:43.979308 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn" Apr 24 19:07:43.979806 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:43.979397 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e" Apr 24 19:07:43.979806 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:43.979524 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2" Apr 24 19:07:44.212283 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:44.212257 2568 generic.go:358] "Generic (PLEG): container finished" podID="1186bd80-4999-47f8-b309-3246becab924" containerID="0dc110991d85279f7c262dd736eff2e6a095c275cf431152c94596fe4fcc7f33" exitCode=0 Apr 24 19:07:44.212414 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:44.212336 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" event={"ID":"1186bd80-4999-47f8-b309-3246becab924","Type":"ContainerDied","Data":"0dc110991d85279f7c262dd736eff2e6a095c275cf431152c94596fe4fcc7f33"} Apr 24 19:07:45.979572 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:45.979547 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn" Apr 24 19:07:45.980020 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:45.979639 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b49hn" podUID="7c6541c7-0cb3-447d-baaa-7d58f2cba8e2" Apr 24 19:07:45.980020 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:45.979716 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:07:45.980020 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:45.979795 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e" Apr 24 19:07:46.217441 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.217412 2568 generic.go:358] "Generic (PLEG): container finished" podID="1186bd80-4999-47f8-b309-3246becab924" containerID="4fa33216c51ec262ca068e1bfc037b60c716c0794f3d23860916700542538537" exitCode=0 Apr 24 19:07:46.217573 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.217455 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" event={"ID":"1186bd80-4999-47f8-b309-3246becab924","Type":"ContainerDied","Data":"4fa33216c51ec262ca068e1bfc037b60c716c0794f3d23860916700542538537"} Apr 24 19:07:46.639593 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.639570 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-23.ec2.internal" event="NodeReady" Apr 24 19:07:46.639686 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.639659 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 19:07:46.689607 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.689552 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zxshv"] Apr 24 19:07:46.725392 
ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.725371 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jlkwm"] Apr 24 19:07:46.725527 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.725510 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zxshv" Apr 24 19:07:46.728687 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.728549 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 19:07:46.728872 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.728855 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 19:07:46.728925 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.728899 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tkcsj\"" Apr 24 19:07:46.728987 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.728932 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 19:07:46.747684 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.747663 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zxshv"] Apr 24 19:07:46.747684 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.747682 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jlkwm"] Apr 24 19:07:46.747795 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.747769 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.749901 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.749880 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 19:07:46.750155 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.750140 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 19:07:46.750227 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.750155 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d7j6w\"" Apr 24 19:07:46.788132 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.788109 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv" Apr 24 19:07:46.788257 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.788150 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9grf6\" (UniqueName: \"kubernetes.io/projected/e703cafc-bfc2-4649-968d-ef6e4318694a-kube-api-access-9grf6\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv" Apr 24 19:07:46.888843 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.888821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv" Apr 24 19:07:46.889001 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.888872 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9grf6\" (UniqueName: \"kubernetes.io/projected/e703cafc-bfc2-4649-968d-ef6e4318694a-kube-api-access-9grf6\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv" Apr 24 19:07:46.889001 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:46.888966 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:46.889117 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:46.889015 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert podName:e703cafc-bfc2-4649-968d-ef6e4318694a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:47.389001409 +0000 UTC m=+32.028121229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert") pod "ingress-canary-zxshv" (UID: "e703cafc-bfc2-4649-968d-ef6e4318694a") : secret "canary-serving-cert" not found Apr 24 19:07:46.889117 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.889061 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.889117 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.889106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-tmp-dir\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.889279 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:07:46.889133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmfdc\" (UniqueName: \"kubernetes.io/projected/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-kube-api-access-kmfdc\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.889279 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.889163 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-config-volume\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.898996 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.898977 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9grf6\" (UniqueName: \"kubernetes.io/projected/e703cafc-bfc2-4649-968d-ef6e4318694a-kube-api-access-9grf6\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv" Apr 24 19:07:46.989682 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.989614 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.989682 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.989653 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-tmp-dir\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.989682 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:07:46.989679 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmfdc\" (UniqueName: \"kubernetes.io/projected/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-kube-api-access-kmfdc\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.990184 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:46.989771 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:46.990184 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.989793 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-config-volume\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.990184 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:46.989821 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls podName:7fae1c6b-197d-49e4-a9eb-9b922eaa6f48 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:47.489803376 +0000 UTC m=+32.128923199 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls") pod "dns-default-jlkwm" (UID: "7fae1c6b-197d-49e4-a9eb-9b922eaa6f48") : secret "dns-default-metrics-tls" not found Apr 24 19:07:46.990184 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.990009 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-tmp-dir\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:46.990362 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:46.990287 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-config-volume\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:47.002540 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.002517 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmfdc\" (UniqueName: \"kubernetes.io/projected/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-kube-api-access-kmfdc\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm" Apr 24 19:07:47.393109 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.393070 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv" Apr 24 19:07:47.393277 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:47.393230 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 
19:07:47.393341 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:47.393319 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert podName:e703cafc-bfc2-4649-968d-ef6e4318694a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:48.393298441 +0000 UTC m=+33.032418263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert") pod "ingress-canary-zxshv" (UID: "e703cafc-bfc2-4649-968d-ef6e4318694a") : secret "canary-serving-cert" not found
Apr 24 19:07:47.493737 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.493700 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm"
Apr 24 19:07:47.493871 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:47.493815 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:47.493871 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:47.493869 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls podName:7fae1c6b-197d-49e4-a9eb-9b922eaa6f48 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:48.493853259 +0000 UTC m=+33.132973079 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls") pod "dns-default-jlkwm" (UID: "7fae1c6b-197d-49e4-a9eb-9b922eaa6f48") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:47.979034 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.978995 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:47.979034 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.979033 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:47.982521 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.982501 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-72mrf\""
Apr 24 19:07:47.982521 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.982510 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 19:07:47.982681 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.982537 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 19:07:47.982681 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.982501 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h6pgv\""
Apr 24 19:07:47.982681 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:47.982501 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 19:07:48.400706 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:48.400671 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:07:48.401303 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:48.400822 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:48.401303 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:48.400889 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert podName:e703cafc-bfc2-4649-968d-ef6e4318694a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:50.400873237 +0000 UTC m=+35.039993056 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert") pod "ingress-canary-zxshv" (UID: "e703cafc-bfc2-4649-968d-ef6e4318694a") : secret "canary-serving-cert" not found
Apr 24 19:07:48.501738 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:48.501703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm"
Apr 24 19:07:48.501895 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:48.501829 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:48.501948 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:48.501904 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls podName:7fae1c6b-197d-49e4-a9eb-9b922eaa6f48 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:50.501883973 +0000 UTC m=+35.141003798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls") pod "dns-default-jlkwm" (UID: "7fae1c6b-197d-49e4-a9eb-9b922eaa6f48") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:48.602714 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:48.602678 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:07:48.602884 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:48.602847 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 19:07:48.602973 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:48.602945 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs podName:060d8b4b-7fbe-4109-888d-a5c4822cff6e nodeName:}" failed. No retries permitted until 2026-04-24 19:08:20.602925404 +0000 UTC m=+65.242045240 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs") pod "network-metrics-daemon-4lz47" (UID: "060d8b4b-7fbe-4109-888d-a5c4822cff6e") : secret "metrics-daemon-secret" not found
Apr 24 19:07:48.703161 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:48.703080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhdl\" (UniqueName: \"kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl\") pod \"network-check-target-b49hn\" (UID: \"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2\") " pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:48.715080 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:48.715051 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhdl\" (UniqueName: \"kubernetes.io/projected/7c6541c7-0cb3-447d-baaa-7d58f2cba8e2-kube-api-access-hnhdl\") pod \"network-check-target-b49hn\" (UID: \"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2\") " pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:48.897879 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:48.897849 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:49.085343 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:49.085310 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b49hn"]
Apr 24 19:07:49.091519 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:07:49.091489 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c6541c7_0cb3_447d_baaa_7d58f2cba8e2.slice/crio-1809b488ee0a83546f6af573bdd5057b8ff938bc393344e80b42ec49641989eb WatchSource:0}: Error finding container 1809b488ee0a83546f6af573bdd5057b8ff938bc393344e80b42ec49641989eb: Status 404 returned error can't find the container with id 1809b488ee0a83546f6af573bdd5057b8ff938bc393344e80b42ec49641989eb
Apr 24 19:07:49.224884 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:49.224851 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b49hn" event={"ID":"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2","Type":"ContainerStarted","Data":"1809b488ee0a83546f6af573bdd5057b8ff938bc393344e80b42ec49641989eb"}
Apr 24 19:07:50.416125 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:50.415484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:07:50.416125 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:50.415680 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:50.416125 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:50.415746 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert podName:e703cafc-bfc2-4649-968d-ef6e4318694a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:54.415725748 +0000 UTC m=+39.054845590 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert") pod "ingress-canary-zxshv" (UID: "e703cafc-bfc2-4649-968d-ef6e4318694a") : secret "canary-serving-cert" not found
Apr 24 19:07:50.517160 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:50.516868 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm"
Apr 24 19:07:50.517160 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:50.517029 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:50.517160 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:50.517105 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls podName:7fae1c6b-197d-49e4-a9eb-9b922eaa6f48 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:54.517085289 +0000 UTC m=+39.156205116 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls") pod "dns-default-jlkwm" (UID: "7fae1c6b-197d-49e4-a9eb-9b922eaa6f48") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:54.236881 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:54.236654 2568 generic.go:358] "Generic (PLEG): container finished" podID="1186bd80-4999-47f8-b309-3246becab924" containerID="d4c8de8b0525bbb1a9d97c4f7853e0234a848cfbc714e31a93f1a6a5f5e2fd58" exitCode=0
Apr 24 19:07:54.237340 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:54.236733 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" event={"ID":"1186bd80-4999-47f8-b309-3246becab924","Type":"ContainerDied","Data":"d4c8de8b0525bbb1a9d97c4f7853e0234a848cfbc714e31a93f1a6a5f5e2fd58"}
Apr 24 19:07:54.238546 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:54.238519 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b49hn" event={"ID":"7c6541c7-0cb3-447d-baaa-7d58f2cba8e2","Type":"ContainerStarted","Data":"4e0dd9df3401d470182579755b4091688e1612348b8ba063ccb32701138590db"}
Apr 24 19:07:54.238690 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:54.238676 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:07:54.312142 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:54.312093 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b49hn" podStartSLOduration=34.014347425 podStartE2EDuration="38.31207621s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:49.093436685 +0000 UTC m=+33.732556505" lastFinishedPulling="2026-04-24 19:07:53.391165456 +0000 UTC m=+38.030285290" observedRunningTime="2026-04-24 19:07:54.311812154 +0000 UTC m=+38.950931998" watchObservedRunningTime="2026-04-24 19:07:54.31207621 +0000 UTC m=+38.951196051"
Apr 24 19:07:54.444030 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:54.444005 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:07:54.444316 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:54.444162 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:54.444316 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:54.444237 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert podName:e703cafc-bfc2-4649-968d-ef6e4318694a nodeName:}" failed. No retries permitted until 2026-04-24 19:08:02.444217339 +0000 UTC m=+47.083337170 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert") pod "ingress-canary-zxshv" (UID: "e703cafc-bfc2-4649-968d-ef6e4318694a") : secret "canary-serving-cert" not found
Apr 24 19:07:54.544932 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:54.544879 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm"
Apr 24 19:07:54.545034 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:54.545003 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:54.545080 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:07:54.545049 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls podName:7fae1c6b-197d-49e4-a9eb-9b922eaa6f48 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:02.545034663 +0000 UTC m=+47.184154483 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls") pod "dns-default-jlkwm" (UID: "7fae1c6b-197d-49e4-a9eb-9b922eaa6f48") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:55.243306 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:55.243277 2568 generic.go:358] "Generic (PLEG): container finished" podID="1186bd80-4999-47f8-b309-3246becab924" containerID="e59ba4899bf2a6a9ccd097eee70ebd2d618bfd81e88d20a0de637b1ba9f0c6e9" exitCode=0
Apr 24 19:07:55.243666 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:55.243357 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" event={"ID":"1186bd80-4999-47f8-b309-3246becab924","Type":"ContainerDied","Data":"e59ba4899bf2a6a9ccd097eee70ebd2d618bfd81e88d20a0de637b1ba9f0c6e9"}
Apr 24 19:07:56.248057 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:56.248019 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" event={"ID":"1186bd80-4999-47f8-b309-3246becab924","Type":"ContainerStarted","Data":"3b34fe27655c518d28e7dfa975b59c8891ba75a376d8fba305c133199cb3d4f7"}
Apr 24 19:07:56.269177 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:07:56.269123 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qxnnz" podStartSLOduration=4.108487401 podStartE2EDuration="40.26911043s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:17.235892159 +0000 UTC m=+1.875011978" lastFinishedPulling="2026-04-24 19:07:53.396515171 +0000 UTC m=+38.035635007" observedRunningTime="2026-04-24 19:07:56.268094665 +0000 UTC m=+40.907214525" watchObservedRunningTime="2026-04-24 19:07:56.26911043 +0000 UTC m=+40.908230272"
Apr 24 19:08:02.491400 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:08:02.491361 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:08:02.491947 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:02.491483 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:08:02.491947 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:02.491541 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert podName:e703cafc-bfc2-4649-968d-ef6e4318694a nodeName:}" failed. No retries permitted until 2026-04-24 19:08:18.491526158 +0000 UTC m=+63.130645977 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert") pod "ingress-canary-zxshv" (UID: "e703cafc-bfc2-4649-968d-ef6e4318694a") : secret "canary-serving-cert" not found
Apr 24 19:08:02.592389 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:08:02.592360 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm"
Apr 24 19:08:02.592482 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:02.592467 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:08:02.592535 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:02.592524 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls podName:7fae1c6b-197d-49e4-a9eb-9b922eaa6f48 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:18.592507386 +0000 UTC m=+63.231627210 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls") pod "dns-default-jlkwm" (UID: "7fae1c6b-197d-49e4-a9eb-9b922eaa6f48") : secret "dns-default-metrics-tls" not found
Apr 24 19:08:14.229722 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:08:14.229696 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj77d"
Apr 24 19:08:18.589930 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:08:18.589890 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:08:18.590441 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:18.590044 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:08:18.590441 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:18.590118 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert podName:e703cafc-bfc2-4649-968d-ef6e4318694a nodeName:}" failed. No retries permitted until 2026-04-24 19:08:50.590102206 +0000 UTC m=+95.229222025 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert") pod "ingress-canary-zxshv" (UID: "e703cafc-bfc2-4649-968d-ef6e4318694a") : secret "canary-serving-cert" not found
Apr 24 19:08:18.691058 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:08:18.691033 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm"
Apr 24 19:08:18.691149 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:18.691132 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:08:18.691199 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:18.691187 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls podName:7fae1c6b-197d-49e4-a9eb-9b922eaa6f48 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:50.691174956 +0000 UTC m=+95.330294774 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls") pod "dns-default-jlkwm" (UID: "7fae1c6b-197d-49e4-a9eb-9b922eaa6f48") : secret "dns-default-metrics-tls" not found
Apr 24 19:08:20.702379 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:08:20.702320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47"
Apr 24 19:08:20.702900 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:20.702498 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 19:08:20.702900 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:20.702593 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs podName:060d8b4b-7fbe-4109-888d-a5c4822cff6e nodeName:}" failed. No retries permitted until 2026-04-24 19:09:24.702568737 +0000 UTC m=+129.341688556 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs") pod "network-metrics-daemon-4lz47" (UID: "060d8b4b-7fbe-4109-888d-a5c4822cff6e") : secret "metrics-daemon-secret" not found
Apr 24 19:08:25.246345 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:08:25.246315 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b49hn"
Apr 24 19:08:50.602243 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:08:50.602201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:08:50.602682 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:50.602328 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:08:50.602682 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:50.602385 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert podName:e703cafc-bfc2-4649-968d-ef6e4318694a nodeName:}" failed. No retries permitted until 2026-04-24 19:09:54.602370864 +0000 UTC m=+159.241490683 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert") pod "ingress-canary-zxshv" (UID: "e703cafc-bfc2-4649-968d-ef6e4318694a") : secret "canary-serving-cert" not found
Apr 24 19:08:50.702759 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:08:50.702732 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm"
Apr 24 19:08:50.702849 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:50.702840 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:08:50.702911 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:08:50.702900 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls podName:7fae1c6b-197d-49e4-a9eb-9b922eaa6f48 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:54.702886228 +0000 UTC m=+159.342006052 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls") pod "dns-default-jlkwm" (UID: "7fae1c6b-197d-49e4-a9eb-9b922eaa6f48") : secret "dns-default-metrics-tls" not found
Apr 24 19:09:14.597413 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.597372 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"]
Apr 24 19:09:14.600199 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.600179 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.603139 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.603109 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 19:09:14.604227 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.604197 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 19:09:14.604512 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.604483 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-vj44j\""
Apr 24 19:09:14.604622 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.604607 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 19:09:14.606584 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.604688 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 19:09:14.619812 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.619783 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"]
Apr 24 19:09:14.641374 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.641347 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q9q9\" (UniqueName: \"kubernetes.io/projected/c4d6f584-a9a2-4297-9d42-6683202fc40f-kube-api-access-9q9q9\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxjdh\" (UID: \"c4d6f584-a9a2-4297-9d42-6683202fc40f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.641466 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.641381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d6f584-a9a2-4297-9d42-6683202fc40f-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxjdh\" (UID: \"c4d6f584-a9a2-4297-9d42-6683202fc40f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.641466 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.641408 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4d6f584-a9a2-4297-9d42-6683202fc40f-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxjdh\" (UID: \"c4d6f584-a9a2-4297-9d42-6683202fc40f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.692586 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.692564 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj"]
Apr 24 19:09:14.695627 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.695613 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj"
Apr 24 19:09:14.699166 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.699142 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 19:09:14.699387 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.699366 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-b6t55\""
Apr 24 19:09:14.699517 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.699502 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 19:09:14.701338 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.701305 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d9pbk"]
Apr 24 19:09:14.703938 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.703911 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d9pbk"
Apr 24 19:09:14.704933 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.704910 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj"]
Apr 24 19:09:14.706223 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.706205 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 19:09:14.706348 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.706237 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 19:09:14.707229 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.707192 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 19:09:14.708275 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.708258 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 19:09:14.710464 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.710430 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-5xcbr\""
Apr 24 19:09:14.719434 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.719411 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d9pbk"]
Apr 24 19:09:14.720920 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.720904 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 19:09:14.742204 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742182 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4d6f584-a9a2-4297-9d42-6683202fc40f-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxjdh\" (UID: \"c4d6f584-a9a2-4297-9d42-6683202fc40f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.742290 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqwzj\" (UniqueName: \"kubernetes.io/projected/f3634274-2fdb-4eaf-aecc-9c56d2c42a6d-kube-api-access-hqwzj\") pod \"volume-data-source-validator-7c6cbb6c87-7cjvj\" (UID: \"f3634274-2fdb-4eaf-aecc-9c56d2c42a6d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj"
Apr 24 19:09:14.742290 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-serving-cert\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk"
Apr 24 19:09:14.742380 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742367 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q9q9\" (UniqueName: \"kubernetes.io/projected/c4d6f584-a9a2-4297-9d42-6683202fc40f-kube-api-access-9q9q9\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxjdh\" (UID: \"c4d6f584-a9a2-4297-9d42-6683202fc40f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.742420 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742397 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d6f584-a9a2-4297-9d42-6683202fc40f-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxjdh\" (UID: \"c4d6f584-a9a2-4297-9d42-6683202fc40f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.742474 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742416 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-snapshots\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk"
Apr 24 19:09:14.742474 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742449 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-tmp\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk"
Apr 24 19:09:14.742567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742472 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk"
Apr 24 19:09:14.742567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-service-ca-bundle\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk"
Apr 24 19:09:14.742663 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742569 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqqbj\" (UniqueName: \"kubernetes.io/projected/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-kube-api-access-fqqbj\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk"
Apr 24 19:09:14.742878 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.742860 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d6f584-a9a2-4297-9d42-6683202fc40f-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxjdh\" (UID: \"c4d6f584-a9a2-4297-9d42-6683202fc40f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.745192 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.745177 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4d6f584-a9a2-4297-9d42-6683202fc40f-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxjdh\" (UID: \"c4d6f584-a9a2-4297-9d42-6683202fc40f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.759455 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.759430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q9q9\" (UniqueName: \"kubernetes.io/projected/c4d6f584-a9a2-4297-9d42-6683202fc40f-kube-api-access-9q9q9\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxjdh\" (UID: \"c4d6f584-a9a2-4297-9d42-6683202fc40f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"
Apr 24 19:09:14.806999 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.806972 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg"]
Apr 24 19:09:14.809902 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.809885 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg"
Apr 24 19:09:14.812422 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.812402 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 19:09:14.812422 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.812414 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xg5wz\""
Apr 24 19:09:14.812551 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.812515 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 19:09:14.812667 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.812653 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 19:09:14.821013 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.820996 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg"]
Apr 24 19:09:14.843771 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.843746 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-snapshots\") pod
\"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.843864 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.843780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-tmp\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.843864 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.843807 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.843864 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.843831 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-service-ca-bundle\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.843864 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.843856 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqqbj\" (UniqueName: \"kubernetes.io/projected/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-kube-api-access-fqqbj\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.844085 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.843886 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:14.844085 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.844045 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqwzj\" (UniqueName: \"kubernetes.io/projected/f3634274-2fdb-4eaf-aecc-9c56d2c42a6d-kube-api-access-hqwzj\") pod \"volume-data-source-validator-7c6cbb6c87-7cjvj\" (UID: \"f3634274-2fdb-4eaf-aecc-9c56d2c42a6d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj" Apr 24 19:09:14.844085 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.844080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-serving-cert\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.844236 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.844164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdxjl\" (UniqueName: \"kubernetes.io/projected/4b91e394-5902-49ee-b5d6-296b79e40e07-kube-api-access-vdxjl\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:14.844422 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.844402 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-snapshots\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.844528 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.844420 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-tmp\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.844996 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.844936 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.845274 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.845254 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-service-ca-bundle\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.846585 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.846567 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-serving-cert\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.852098 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.852050 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqqbj\" (UniqueName: \"kubernetes.io/projected/fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa-kube-api-access-fqqbj\") pod \"insights-operator-585dfdc468-d9pbk\" (UID: \"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa\") " pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:14.852653 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.852635 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqwzj\" (UniqueName: \"kubernetes.io/projected/f3634274-2fdb-4eaf-aecc-9c56d2c42a6d-kube-api-access-hqwzj\") pod \"volume-data-source-validator-7c6cbb6c87-7cjvj\" (UID: \"f3634274-2fdb-4eaf-aecc-9c56d2c42a6d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj" Apr 24 19:09:14.911408 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.911389 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh" Apr 24 19:09:14.945510 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.945481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:14.945614 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.945585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdxjl\" (UniqueName: \"kubernetes.io/projected/4b91e394-5902-49ee-b5d6-296b79e40e07-kube-api-access-vdxjl\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:14.945669 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:14.945610 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 19:09:14.945717 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:14.945674 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls podName:4b91e394-5902-49ee-b5d6-296b79e40e07 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:15.445661032 +0000 UTC m=+120.084780856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-464qg" (UID: "4b91e394-5902-49ee-b5d6-296b79e40e07") : secret "samples-operator-tls" not found Apr 24 19:09:14.956001 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:14.955977 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdxjl\" (UniqueName: \"kubernetes.io/projected/4b91e394-5902-49ee-b5d6-296b79e40e07-kube-api-access-vdxjl\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:15.004354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.004327 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj" Apr 24 19:09:15.013006 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.012986 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d9pbk" Apr 24 19:09:15.020257 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.020234 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh"] Apr 24 19:09:15.023358 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:15.023320 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4d6f584_a9a2_4297_9d42_6683202fc40f.slice/crio-e1003dd232561699d065857c7d805045afe5614b4f932eaa0f5d83fb7711df7f WatchSource:0}: Error finding container e1003dd232561699d065857c7d805045afe5614b4f932eaa0f5d83fb7711df7f: Status 404 returned error can't find the container with id e1003dd232561699d065857c7d805045afe5614b4f932eaa0f5d83fb7711df7f Apr 24 19:09:15.124818 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.124755 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj"] Apr 24 19:09:15.128100 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:15.128069 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3634274_2fdb_4eaf_aecc_9c56d2c42a6d.slice/crio-5d8c5142bab2c3ccb72eda01e00f7a1f9a07ec4c8d446874930cd3b1015d9b51 WatchSource:0}: Error finding container 5d8c5142bab2c3ccb72eda01e00f7a1f9a07ec4c8d446874930cd3b1015d9b51: Status 404 returned error can't find the container with id 5d8c5142bab2c3ccb72eda01e00f7a1f9a07ec4c8d446874930cd3b1015d9b51 Apr 24 19:09:15.140812 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.140792 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d9pbk"] Apr 24 19:09:15.143425 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:15.143404 2568 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc7b8d95_3941_4c11_9c0a_14bfffe3e1fa.slice/crio-6f3668972acd0dd275cbfc65e0a0934579bd4c1b93df67c90b48da86a65c772e WatchSource:0}: Error finding container 6f3668972acd0dd275cbfc65e0a0934579bd4c1b93df67c90b48da86a65c772e: Status 404 returned error can't find the container with id 6f3668972acd0dd275cbfc65e0a0934579bd4c1b93df67c90b48da86a65c772e Apr 24 19:09:15.398607 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.398538 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj" event={"ID":"f3634274-2fdb-4eaf-aecc-9c56d2c42a6d","Type":"ContainerStarted","Data":"5d8c5142bab2c3ccb72eda01e00f7a1f9a07ec4c8d446874930cd3b1015d9b51"} Apr 24 19:09:15.399386 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.399366 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d9pbk" event={"ID":"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa","Type":"ContainerStarted","Data":"6f3668972acd0dd275cbfc65e0a0934579bd4c1b93df67c90b48da86a65c772e"} Apr 24 19:09:15.400249 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.400230 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh" event={"ID":"c4d6f584-a9a2-4297-9d42-6683202fc40f","Type":"ContainerStarted","Data":"e1003dd232561699d065857c7d805045afe5614b4f932eaa0f5d83fb7711df7f"} Apr 24 19:09:15.449835 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.449816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:15.452883 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:15.450186 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 19:09:15.452883 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:15.450263 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls podName:4b91e394-5902-49ee-b5d6-296b79e40e07 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:16.450243999 +0000 UTC m=+121.089363828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-464qg" (UID: "4b91e394-5902-49ee-b5d6-296b79e40e07") : secret "samples-operator-tls" not found Apr 24 19:09:15.594518 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.594479 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j"] Apr 24 19:09:15.597274 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.597257 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.600415 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.600393 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 19:09:15.600786 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.600667 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 19:09:15.600786 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.600708 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-b7jgc\"" Apr 24 19:09:15.601149 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.601130 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 19:09:15.601555 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.601534 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:09:15.607580 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.607563 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j"] Apr 24 19:09:15.651119 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.651060 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f2135d-60e2-4ac2-9dc5-8f2de4ca429c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-jqb5j\" (UID: \"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.651119 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.651093 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twz7q\" (UniqueName: \"kubernetes.io/projected/44f2135d-60e2-4ac2-9dc5-8f2de4ca429c-kube-api-access-twz7q\") pod \"service-ca-operator-d6fc45fc5-jqb5j\" (UID: \"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.651119 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.651117 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f2135d-60e2-4ac2-9dc5-8f2de4ca429c-config\") pod \"service-ca-operator-d6fc45fc5-jqb5j\" (UID: \"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.753165 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.752369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f2135d-60e2-4ac2-9dc5-8f2de4ca429c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-jqb5j\" (UID: \"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.753165 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.752409 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twz7q\" (UniqueName: \"kubernetes.io/projected/44f2135d-60e2-4ac2-9dc5-8f2de4ca429c-kube-api-access-twz7q\") pod \"service-ca-operator-d6fc45fc5-jqb5j\" (UID: \"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.753165 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.752444 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f2135d-60e2-4ac2-9dc5-8f2de4ca429c-config\") pod \"service-ca-operator-d6fc45fc5-jqb5j\" 
(UID: \"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.753165 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.753119 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f2135d-60e2-4ac2-9dc5-8f2de4ca429c-config\") pod \"service-ca-operator-d6fc45fc5-jqb5j\" (UID: \"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.755858 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.755817 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f2135d-60e2-4ac2-9dc5-8f2de4ca429c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-jqb5j\" (UID: \"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.762008 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.761968 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twz7q\" (UniqueName: \"kubernetes.io/projected/44f2135d-60e2-4ac2-9dc5-8f2de4ca429c-kube-api-access-twz7q\") pod \"service-ca-operator-d6fc45fc5-jqb5j\" (UID: \"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:15.910876 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.910644 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-b7jgc\"" Apr 24 19:09:15.918676 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:15.918653 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" Apr 24 19:09:16.071276 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:16.071247 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j"] Apr 24 19:09:16.076230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:16.076205 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f2135d_60e2_4ac2_9dc5_8f2de4ca429c.slice/crio-18c4dae130dd83779adfa5c4548232ddf53c4e817d28b87bcef1f66af6ccc5c6 WatchSource:0}: Error finding container 18c4dae130dd83779adfa5c4548232ddf53c4e817d28b87bcef1f66af6ccc5c6: Status 404 returned error can't find the container with id 18c4dae130dd83779adfa5c4548232ddf53c4e817d28b87bcef1f66af6ccc5c6 Apr 24 19:09:16.403921 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:16.403834 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" event={"ID":"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c","Type":"ContainerStarted","Data":"18c4dae130dd83779adfa5c4548232ddf53c4e817d28b87bcef1f66af6ccc5c6"} Apr 24 19:09:16.457812 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:16.457783 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:16.457968 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:16.457931 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 19:09:16.458024 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:16.458012 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls podName:4b91e394-5902-49ee-b5d6-296b79e40e07 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:18.457995792 +0000 UTC m=+123.097115617 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-464qg" (UID: "4b91e394-5902-49ee-b5d6-296b79e40e07") : secret "samples-operator-tls" not found Apr 24 19:09:17.512420 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.512376 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55d5479bb4-7cwnd"] Apr 24 19:09:17.517162 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.517131 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.523593 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.523555 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 19:09:17.523722 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.523644 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 19:09:17.524055 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.524031 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 19:09:17.524844 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.524548 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kn9zh\"" Apr 24 19:09:17.533396 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.533374 2568 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 19:09:17.550771 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.550750 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55d5479bb4-7cwnd"] Apr 24 19:09:17.566843 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.566817 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-bound-sa-token\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.566995 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.566977 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-ca-trust-extracted\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.567079 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.567011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-trusted-ca\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.567134 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.567078 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-certificates\") pod 
\"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.567134 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.567118 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.567235 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.567142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66x9q\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-kube-api-access-66x9q\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.567235 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.567195 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-image-registry-private-configuration\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.567235 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.567224 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-installation-pull-secrets\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " 
pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.668252 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.668229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-certificates\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.668355 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.668259 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.668355 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.668279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66x9q\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-kube-api-access-66x9q\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.668425 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:17.668376 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:09:17.668425 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:17.668394 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d5479bb4-7cwnd: secret "image-registry-tls" not found Apr 24 19:09:17.668489 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:17.668465 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls podName:1b4671a4-a2bd-4f33-b18e-eead8bfe79fd nodeName:}" failed. No retries permitted until 2026-04-24 19:09:18.168441182 +0000 UTC m=+122.807561017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls") pod "image-registry-55d5479bb4-7cwnd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd") : secret "image-registry-tls" not found Apr 24 19:09:17.668539 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.668487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-image-registry-private-configuration\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.668539 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.668522 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-installation-pull-secrets\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.668613 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.668554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-bound-sa-token\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.668673 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.668655 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-ca-trust-extracted\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.668726 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.668687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-trusted-ca\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.668943 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.668921 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-certificates\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.669104 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.669083 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-ca-trust-extracted\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.669464 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.669447 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-trusted-ca\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " 
pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.670725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.670704 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-installation-pull-secrets\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.670790 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.670766 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-image-registry-private-configuration\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.677745 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.677724 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-bound-sa-token\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:17.677942 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:17.677923 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66x9q\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-kube-api-access-66x9q\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:18.172981 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.172858 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:18.173142 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:18.173003 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:09:18.173142 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:18.173019 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d5479bb4-7cwnd: secret "image-registry-tls" not found Apr 24 19:09:18.173142 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:18.173072 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls podName:1b4671a4-a2bd-4f33-b18e-eead8bfe79fd nodeName:}" failed. No retries permitted until 2026-04-24 19:09:19.17305442 +0000 UTC m=+123.812174241 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls") pod "image-registry-55d5479bb4-7cwnd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd") : secret "image-registry-tls" not found Apr 24 19:09:18.410631 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.410549 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d9pbk" event={"ID":"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa","Type":"ContainerStarted","Data":"36f791330fb1feb7690f22befb3c15aa9f5bbc84b3d326f221fdb07ebaebab27"} Apr 24 19:09:18.411993 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.411965 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh" event={"ID":"c4d6f584-a9a2-4297-9d42-6683202fc40f","Type":"ContainerStarted","Data":"e0903b8081a2b1b3cd261404f5c8041d3ed564103de60a044cdc24051fe8736f"} Apr 24 19:09:18.413242 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.413220 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj" event={"ID":"f3634274-2fdb-4eaf-aecc-9c56d2c42a6d","Type":"ContainerStarted","Data":"db7e4a5762073cf47aa4574d8cd421cef80d2921799c21bc145c69ca094a139f"} Apr 24 19:09:18.414546 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.414524 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" event={"ID":"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c","Type":"ContainerStarted","Data":"fcff2afec9cf768f43e5eab9747347b9bd7b938c94375b829ace9ca9d74ac876"} Apr 24 19:09:18.432942 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.432892 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-d9pbk" 
podStartSLOduration=1.4241871019999999 podStartE2EDuration="4.432876916s" podCreationTimestamp="2026-04-24 19:09:14 +0000 UTC" firstStartedPulling="2026-04-24 19:09:15.145292998 +0000 UTC m=+119.784412822" lastFinishedPulling="2026-04-24 19:09:18.15398281 +0000 UTC m=+122.793102636" observedRunningTime="2026-04-24 19:09:18.432484929 +0000 UTC m=+123.071604773" watchObservedRunningTime="2026-04-24 19:09:18.432876916 +0000 UTC m=+123.071996762" Apr 24 19:09:18.452726 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.452681 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7cjvj" podStartSLOduration=1.428404764 podStartE2EDuration="4.452666594s" podCreationTimestamp="2026-04-24 19:09:14 +0000 UTC" firstStartedPulling="2026-04-24 19:09:15.12979072 +0000 UTC m=+119.768910539" lastFinishedPulling="2026-04-24 19:09:18.154052535 +0000 UTC m=+122.793172369" observedRunningTime="2026-04-24 19:09:18.452242155 +0000 UTC m=+123.091361998" watchObservedRunningTime="2026-04-24 19:09:18.452666594 +0000 UTC m=+123.091786436" Apr 24 19:09:18.475248 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.475225 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:18.476309 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:18.475468 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 19:09:18.476309 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:18.475534 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls podName:4b91e394-5902-49ee-b5d6-296b79e40e07 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:22.475515292 +0000 UTC m=+127.114635114 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-464qg" (UID: "4b91e394-5902-49ee-b5d6-296b79e40e07") : secret "samples-operator-tls" not found Apr 24 19:09:18.509453 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.509415 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" podStartSLOduration=1.381278571 podStartE2EDuration="3.509405908s" podCreationTimestamp="2026-04-24 19:09:15 +0000 UTC" firstStartedPulling="2026-04-24 19:09:16.078596982 +0000 UTC m=+120.717716804" lastFinishedPulling="2026-04-24 19:09:18.206724308 +0000 UTC m=+122.845844141" observedRunningTime="2026-04-24 19:09:18.509181239 +0000 UTC m=+123.148301081" watchObservedRunningTime="2026-04-24 19:09:18.509405908 +0000 UTC m=+123.148525749" Apr 24 19:09:18.510035 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:18.510004 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh" podStartSLOduration=1.3815471160000001 podStartE2EDuration="4.509997692s" podCreationTimestamp="2026-04-24 19:09:14 +0000 UTC" firstStartedPulling="2026-04-24 19:09:15.025535956 +0000 UTC m=+119.664655778" lastFinishedPulling="2026-04-24 19:09:18.153986535 +0000 UTC m=+122.793106354" observedRunningTime="2026-04-24 19:09:18.486057297 +0000 UTC m=+123.125177139" watchObservedRunningTime="2026-04-24 19:09:18.509997692 +0000 UTC m=+123.149117532" Apr 24 19:09:19.179757 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:09:19.179727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:19.180146 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:19.179895 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:09:19.180146 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:19.179917 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d5479bb4-7cwnd: secret "image-registry-tls" not found Apr 24 19:09:19.180146 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:19.179997 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls podName:1b4671a4-a2bd-4f33-b18e-eead8bfe79fd nodeName:}" failed. No retries permitted until 2026-04-24 19:09:21.179976989 +0000 UTC m=+125.819096822 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls") pod "image-registry-55d5479bb4-7cwnd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd") : secret "image-registry-tls" not found Apr 24 19:09:21.155118 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:21.155091 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m6mmt_1e21e827-de03-48d5-b7ca-3a5a1c529873/dns-node-resolver/0.log" Apr 24 19:09:21.196505 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:21.196478 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:21.196612 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:21.196577 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:09:21.196612 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:21.196589 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d5479bb4-7cwnd: secret "image-registry-tls" not found Apr 24 19:09:21.196681 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:21.196629 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls podName:1b4671a4-a2bd-4f33-b18e-eead8bfe79fd nodeName:}" failed. No retries permitted until 2026-04-24 19:09:25.196618528 +0000 UTC m=+129.835738346 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls") pod "image-registry-55d5479bb4-7cwnd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd") : secret "image-registry-tls" not found Apr 24 19:09:22.128325 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.128297 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-w6spj"] Apr 24 19:09:22.132395 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.132380 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.135011 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.134986 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 19:09:22.135830 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.135813 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bfg6d\"" Apr 24 19:09:22.135928 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.135816 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 19:09:22.135928 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.135874 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 19:09:22.136063 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.136034 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 19:09:22.146034 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.146013 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-w6spj"] Apr 24 19:09:22.170132 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.170113 
2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vdgfs_4be73708-29e9-4ed5-856c-a07616631d8e/node-ca/0.log" Apr 24 19:09:22.204920 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.204900 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5aff5e99-0a4f-49bc-b933-bfafa6ea0944-signing-key\") pod \"service-ca-865cb79987-w6spj\" (UID: \"5aff5e99-0a4f-49bc-b933-bfafa6ea0944\") " pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.205012 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.204925 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5aff5e99-0a4f-49bc-b933-bfafa6ea0944-signing-cabundle\") pod \"service-ca-865cb79987-w6spj\" (UID: \"5aff5e99-0a4f-49bc-b933-bfafa6ea0944\") " pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.205012 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.204985 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwhcb\" (UniqueName: \"kubernetes.io/projected/5aff5e99-0a4f-49bc-b933-bfafa6ea0944-kube-api-access-hwhcb\") pod \"service-ca-865cb79987-w6spj\" (UID: \"5aff5e99-0a4f-49bc-b933-bfafa6ea0944\") " pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.306272 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.306251 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5aff5e99-0a4f-49bc-b933-bfafa6ea0944-signing-key\") pod \"service-ca-865cb79987-w6spj\" (UID: \"5aff5e99-0a4f-49bc-b933-bfafa6ea0944\") " pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.306367 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.306278 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5aff5e99-0a4f-49bc-b933-bfafa6ea0944-signing-cabundle\") pod \"service-ca-865cb79987-w6spj\" (UID: \"5aff5e99-0a4f-49bc-b933-bfafa6ea0944\") " pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.306367 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.306325 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwhcb\" (UniqueName: \"kubernetes.io/projected/5aff5e99-0a4f-49bc-b933-bfafa6ea0944-kube-api-access-hwhcb\") pod \"service-ca-865cb79987-w6spj\" (UID: \"5aff5e99-0a4f-49bc-b933-bfafa6ea0944\") " pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.306843 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.306822 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5aff5e99-0a4f-49bc-b933-bfafa6ea0944-signing-cabundle\") pod \"service-ca-865cb79987-w6spj\" (UID: \"5aff5e99-0a4f-49bc-b933-bfafa6ea0944\") " pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.309059 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.309034 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5aff5e99-0a4f-49bc-b933-bfafa6ea0944-signing-key\") pod \"service-ca-865cb79987-w6spj\" (UID: \"5aff5e99-0a4f-49bc-b933-bfafa6ea0944\") " pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.315841 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.315822 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwhcb\" (UniqueName: \"kubernetes.io/projected/5aff5e99-0a4f-49bc-b933-bfafa6ea0944-kube-api-access-hwhcb\") pod \"service-ca-865cb79987-w6spj\" (UID: \"5aff5e99-0a4f-49bc-b933-bfafa6ea0944\") " pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.440832 
ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.440782 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-w6spj" Apr 24 19:09:22.513602 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.511164 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:22.513602 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:22.511340 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 19:09:22.513602 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:22.511414 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls podName:4b91e394-5902-49ee-b5d6-296b79e40e07 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:30.511394272 +0000 UTC m=+135.150514095 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-464qg" (UID: "4b91e394-5902-49ee-b5d6-296b79e40e07") : secret "samples-operator-tls" not found Apr 24 19:09:22.566155 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:22.566126 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-w6spj"] Apr 24 19:09:22.569421 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:22.569387 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aff5e99_0a4f_49bc_b933_bfafa6ea0944.slice/crio-cfafa4e8b634f10296796f68dd725067dfa55d7931e02597181edfd93966c9b7 WatchSource:0}: Error finding container cfafa4e8b634f10296796f68dd725067dfa55d7931e02597181edfd93966c9b7: Status 404 returned error can't find the container with id cfafa4e8b634f10296796f68dd725067dfa55d7931e02597181edfd93966c9b7 Apr 24 19:09:23.427965 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:23.427917 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-w6spj" event={"ID":"5aff5e99-0a4f-49bc-b933-bfafa6ea0944","Type":"ContainerStarted","Data":"c89ccef894d965bd19c827abaff20961c01badc9c6e18183c8fd0de4012f33dc"} Apr 24 19:09:23.427965 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:23.427967 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-w6spj" event={"ID":"5aff5e99-0a4f-49bc-b933-bfafa6ea0944","Type":"ContainerStarted","Data":"cfafa4e8b634f10296796f68dd725067dfa55d7931e02597181edfd93966c9b7"} Apr 24 19:09:23.447931 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:23.447880 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-w6spj" podStartSLOduration=1.447863732 
podStartE2EDuration="1.447863732s" podCreationTimestamp="2026-04-24 19:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:09:23.446200932 +0000 UTC m=+128.085320774" watchObservedRunningTime="2026-04-24 19:09:23.447863732 +0000 UTC m=+128.086983575" Apr 24 19:09:24.726995 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:24.726945 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:09:24.727377 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:24.727071 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 19:09:24.727377 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:24.727123 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs podName:060d8b4b-7fbe-4109-888d-a5c4822cff6e nodeName:}" failed. No retries permitted until 2026-04-24 19:11:26.727108475 +0000 UTC m=+251.366228293 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs") pod "network-metrics-daemon-4lz47" (UID: "060d8b4b-7fbe-4109-888d-a5c4822cff6e") : secret "metrics-daemon-secret" not found Apr 24 19:09:25.231608 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:25.231562 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:25.231762 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:25.231699 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:09:25.231762 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:25.231720 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d5479bb4-7cwnd: secret "image-registry-tls" not found Apr 24 19:09:25.231836 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:25.231770 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls podName:1b4671a4-a2bd-4f33-b18e-eead8bfe79fd nodeName:}" failed. No retries permitted until 2026-04-24 19:09:33.231755271 +0000 UTC m=+137.870875094 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls") pod "image-registry-55d5479bb4-7cwnd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd") : secret "image-registry-tls" not found Apr 24 19:09:30.570178 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:30.570145 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:30.572433 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:30.572409 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b91e394-5902-49ee-b5d6-296b79e40e07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-464qg\" (UID: \"4b91e394-5902-49ee-b5d6-296b79e40e07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:30.721137 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:30.721114 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xg5wz\"" Apr 24 19:09:30.729112 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:30.729087 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" Apr 24 19:09:30.851498 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:30.851473 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg"] Apr 24 19:09:31.450118 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:31.450078 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" event={"ID":"4b91e394-5902-49ee-b5d6-296b79e40e07","Type":"ContainerStarted","Data":"9eb92b5b2697ab093de4b96ad7f8b22ef20c53455ce9bd3ecaaf100139dfaa86"} Apr 24 19:09:33.292355 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:33.292318 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:33.294773 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:33.294747 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") pod \"image-registry-55d5479bb4-7cwnd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:33.428315 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:33.428293 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:33.457238 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:33.457210 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" event={"ID":"4b91e394-5902-49ee-b5d6-296b79e40e07","Type":"ContainerStarted","Data":"5664214fb397c349042fcc922204287bbbafd972f827fcdf1e56671f479b51ed"} Apr 24 19:09:33.457238 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:33.457240 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" event={"ID":"4b91e394-5902-49ee-b5d6-296b79e40e07","Type":"ContainerStarted","Data":"12a9f284da00ec222b1179bf8ebb135fec3f365fe8e0c62148af88a177ca51f3"} Apr 24 19:09:33.486268 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:33.486227 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-464qg" podStartSLOduration=17.818992722 podStartE2EDuration="19.486212584s" podCreationTimestamp="2026-04-24 19:09:14 +0000 UTC" firstStartedPulling="2026-04-24 19:09:30.893173063 +0000 UTC m=+135.532292882" lastFinishedPulling="2026-04-24 19:09:32.560392926 +0000 UTC m=+137.199512744" observedRunningTime="2026-04-24 19:09:33.485428368 +0000 UTC m=+138.124548233" watchObservedRunningTime="2026-04-24 19:09:33.486212584 +0000 UTC m=+138.125332424" Apr 24 19:09:33.559125 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:33.559067 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55d5479bb4-7cwnd"] Apr 24 19:09:33.561740 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:33.561713 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b4671a4_a2bd_4f33_b18e_eead8bfe79fd.slice/crio-f6c294dd43497a128dee0b466d55432ff87cc790b33ca9dc231e24039980596c WatchSource:0}: Error finding container f6c294dd43497a128dee0b466d55432ff87cc790b33ca9dc231e24039980596c: Status 404 returned error can't find the container with id f6c294dd43497a128dee0b466d55432ff87cc790b33ca9dc231e24039980596c Apr 24 19:09:34.461461 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:34.461421 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" event={"ID":"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd","Type":"ContainerStarted","Data":"345c97acbb5354834fb43a21f7d973351c03b4a6d1ce2d13f6c6321e92dd9014"} Apr 24 19:09:34.461461 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:34.461462 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" event={"ID":"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd","Type":"ContainerStarted","Data":"f6c294dd43497a128dee0b466d55432ff87cc790b33ca9dc231e24039980596c"} Apr 24 19:09:34.461872 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:34.461529 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:09:46.338820 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.338769 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" podStartSLOduration=29.338755023 podStartE2EDuration="29.338755023s" podCreationTimestamp="2026-04-24 19:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:09:34.494253697 +0000 UTC m=+139.133373540" watchObservedRunningTime="2026-04-24 19:09:46.338755023 +0000 UTC m=+150.977874863" Apr 24 19:09:46.339431 ip-10-0-129-23 kubenswrapper[2568]: 
I0424 19:09:46.339414 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9bv9b"] Apr 24 19:09:46.344101 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.344081 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.346327 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.346301 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fsgkc\"" Apr 24 19:09:46.346788 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.346769 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 19:09:46.348720 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.348701 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 19:09:46.356462 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.356443 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9bv9b"] Apr 24 19:09:46.366607 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.366586 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55d5479bb4-7cwnd"] Apr 24 19:09:46.418256 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.418232 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6686b677f9-mncpn"] Apr 24 19:09:46.421327 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.421312 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.449353 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.449334 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6686b677f9-mncpn"] Apr 24 19:09:46.488333 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488313 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c2e7f038-e2aa-4900-9d66-5fb67c767701-crio-socket\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.488441 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488347 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7d54045-6d96-4b03-8fa0-b6bab817dab6-bound-sa-token\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.488441 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488395 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79kgz\" (UniqueName: \"kubernetes.io/projected/b7d54045-6d96-4b03-8fa0-b6bab817dab6-kube-api-access-79kgz\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.488559 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488438 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7d54045-6d96-4b03-8fa0-b6bab817dab6-registry-certificates\") pod \"image-registry-6686b677f9-mncpn\" (UID: 
\"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.488559 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488468 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7d54045-6d96-4b03-8fa0-b6bab817dab6-installation-pull-secrets\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.488559 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c2e7f038-e2aa-4900-9d66-5fb67c767701-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.488673 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488585 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5ltz\" (UniqueName: \"kubernetes.io/projected/c2e7f038-e2aa-4900-9d66-5fb67c767701-kube-api-access-g5ltz\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.488673 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488659 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7d54045-6d96-4b03-8fa0-b6bab817dab6-trusted-ca\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.488734 ip-10-0-129-23 kubenswrapper[2568]: I0424 
19:09:46.488685 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7d54045-6d96-4b03-8fa0-b6bab817dab6-registry-tls\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.488734 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c2e7f038-e2aa-4900-9d66-5fb67c767701-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.488794 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2e7f038-e2aa-4900-9d66-5fb67c767701-data-volume\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.488794 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488756 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b7d54045-6d96-4b03-8fa0-b6bab817dab6-image-registry-private-configuration\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.488794 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.488791 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/b7d54045-6d96-4b03-8fa0-b6bab817dab6-ca-trust-extracted\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.589224 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7d54045-6d96-4b03-8fa0-b6bab817dab6-registry-certificates\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.589224 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589194 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7d54045-6d96-4b03-8fa0-b6bab817dab6-installation-pull-secrets\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.589381 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c2e7f038-e2aa-4900-9d66-5fb67c767701-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.589381 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589259 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5ltz\" (UniqueName: \"kubernetes.io/projected/c2e7f038-e2aa-4900-9d66-5fb67c767701-kube-api-access-g5ltz\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" 
Apr 24 19:09:46.589381 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589286 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7d54045-6d96-4b03-8fa0-b6bab817dab6-trusted-ca\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.589381 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589311 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7d54045-6d96-4b03-8fa0-b6bab817dab6-registry-tls\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.589381 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589340 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c2e7f038-e2aa-4900-9d66-5fb67c767701-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.589381 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2e7f038-e2aa-4900-9d66-5fb67c767701-data-volume\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.589683 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/b7d54045-6d96-4b03-8fa0-b6bab817dab6-image-registry-private-configuration\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.589683 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589443 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7d54045-6d96-4b03-8fa0-b6bab817dab6-ca-trust-extracted\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.589683 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589472 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c2e7f038-e2aa-4900-9d66-5fb67c767701-crio-socket\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.589683 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589497 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7d54045-6d96-4b03-8fa0-b6bab817dab6-bound-sa-token\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.589683 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589525 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79kgz\" (UniqueName: \"kubernetes.io/projected/b7d54045-6d96-4b03-8fa0-b6bab817dab6-kube-api-access-79kgz\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.589930 
ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589800 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c2e7f038-e2aa-4900-9d66-5fb67c767701-crio-socket\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.589930 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2e7f038-e2aa-4900-9d66-5fb67c767701-data-volume\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.590068 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.589941 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7d54045-6d96-4b03-8fa0-b6bab817dab6-ca-trust-extracted\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.590234 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.590215 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7d54045-6d96-4b03-8fa0-b6bab817dab6-registry-certificates\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.590296 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.590244 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c2e7f038-e2aa-4900-9d66-5fb67c767701-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9bv9b\" (UID: 
\"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.590768 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.590736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7d54045-6d96-4b03-8fa0-b6bab817dab6-trusted-ca\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.591965 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.591933 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7d54045-6d96-4b03-8fa0-b6bab817dab6-installation-pull-secrets\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.592234 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.592211 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b7d54045-6d96-4b03-8fa0-b6bab817dab6-image-registry-private-configuration\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.592299 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.592253 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c2e7f038-e2aa-4900-9d66-5fb67c767701-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.592299 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.592272 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7d54045-6d96-4b03-8fa0-b6bab817dab6-registry-tls\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.601529 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.601503 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7d54045-6d96-4b03-8fa0-b6bab817dab6-bound-sa-token\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.602704 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.602681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79kgz\" (UniqueName: \"kubernetes.io/projected/b7d54045-6d96-4b03-8fa0-b6bab817dab6-kube-api-access-79kgz\") pod \"image-registry-6686b677f9-mncpn\" (UID: \"b7d54045-6d96-4b03-8fa0-b6bab817dab6\") " pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.603311 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.603291 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5ltz\" (UniqueName: \"kubernetes.io/projected/c2e7f038-e2aa-4900-9d66-5fb67c767701-kube-api-access-g5ltz\") pod \"insights-runtime-extractor-9bv9b\" (UID: \"c2e7f038-e2aa-4900-9d66-5fb67c767701\") " pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.652380 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.652359 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9bv9b" Apr 24 19:09:46.729355 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.729330 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:46.774336 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.774311 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9bv9b"] Apr 24 19:09:46.776616 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:46.776581 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e7f038_e2aa_4900_9d66_5fb67c767701.slice/crio-121073252c21af2e410043daab7933bfe824f34b1d4dad8d724434acbcd28983 WatchSource:0}: Error finding container 121073252c21af2e410043daab7933bfe824f34b1d4dad8d724434acbcd28983: Status 404 returned error can't find the container with id 121073252c21af2e410043daab7933bfe824f34b1d4dad8d724434acbcd28983 Apr 24 19:09:46.858697 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:46.858672 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6686b677f9-mncpn"] Apr 24 19:09:46.863811 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:46.863779 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7d54045_6d96_4b03_8fa0_b6bab817dab6.slice/crio-0aebc981f80bcb000d6cf5871cc2337aeb1de0b0700298239f23a8b1374813d1 WatchSource:0}: Error finding container 0aebc981f80bcb000d6cf5871cc2337aeb1de0b0700298239f23a8b1374813d1: Status 404 returned error can't find the container with id 0aebc981f80bcb000d6cf5871cc2337aeb1de0b0700298239f23a8b1374813d1 Apr 24 19:09:47.492862 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:47.492836 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9bv9b" event={"ID":"c2e7f038-e2aa-4900-9d66-5fb67c767701","Type":"ContainerStarted","Data":"ce0815cedbf703a822a7f6aed246e566609801cae26b72c6e1837cf4ac2f970a"} Apr 24 19:09:47.493229 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:09:47.492875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9bv9b" event={"ID":"c2e7f038-e2aa-4900-9d66-5fb67c767701","Type":"ContainerStarted","Data":"2cd07bc1b3475a2030706695425ba0231e278de9fa69e5debcb0e75c23de3cac"} Apr 24 19:09:47.493229 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:47.492892 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9bv9b" event={"ID":"c2e7f038-e2aa-4900-9d66-5fb67c767701","Type":"ContainerStarted","Data":"121073252c21af2e410043daab7933bfe824f34b1d4dad8d724434acbcd28983"} Apr 24 19:09:47.494153 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:47.494128 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6686b677f9-mncpn" event={"ID":"b7d54045-6d96-4b03-8fa0-b6bab817dab6","Type":"ContainerStarted","Data":"98018d1fd649171e305cbc8777e765999804b7152328a52ffc36bd140e4ad539"} Apr 24 19:09:47.494153 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:47.494160 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6686b677f9-mncpn" event={"ID":"b7d54045-6d96-4b03-8fa0-b6bab817dab6","Type":"ContainerStarted","Data":"0aebc981f80bcb000d6cf5871cc2337aeb1de0b0700298239f23a8b1374813d1"} Apr 24 19:09:47.494340 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:47.494259 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:09:47.533413 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:47.533375 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6686b677f9-mncpn" podStartSLOduration=1.533365103 podStartE2EDuration="1.533365103s" podCreationTimestamp="2026-04-24 19:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:09:47.533116705 +0000 UTC m=+152.172236545" watchObservedRunningTime="2026-04-24 19:09:47.533365103 +0000 UTC m=+152.172484945" Apr 24 19:09:49.501794 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:49.501759 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9bv9b" event={"ID":"c2e7f038-e2aa-4900-9d66-5fb67c767701","Type":"ContainerStarted","Data":"cda6b7d3961fa6641f458787d25c2900e6b5d179fcef93504199a0b269727682"} Apr 24 19:09:49.532765 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:49.532722 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9bv9b" podStartSLOduration=1.537907943 podStartE2EDuration="3.53270736s" podCreationTimestamp="2026-04-24 19:09:46 +0000 UTC" firstStartedPulling="2026-04-24 19:09:46.839001772 +0000 UTC m=+151.478121595" lastFinishedPulling="2026-04-24 19:09:48.833801192 +0000 UTC m=+153.472921012" observedRunningTime="2026-04-24 19:09:49.531093885 +0000 UTC m=+154.170213725" watchObservedRunningTime="2026-04-24 19:09:49.53270736 +0000 UTC m=+154.171827203" Apr 24 19:09:49.734723 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:49.734686 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zxshv" podUID="e703cafc-bfc2-4649-968d-ef6e4318694a" Apr 24 19:09:49.756105 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:49.756055 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jlkwm" podUID="7fae1c6b-197d-49e4-a9eb-9b922eaa6f48" Apr 24 19:09:50.504205 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:50.504177 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:09:50.991000 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:50.990969 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4lz47" podUID="060d8b4b-7fbe-4109-888d-a5c4822cff6e"
Apr 24 19:09:51.868727 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.868695 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5xljk"]
Apr 24 19:09:51.872051 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.872032 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:51.875035 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.875015 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 24 19:09:51.875645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.875599 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 19:09:51.875645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.875622 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 19:09:51.875824 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.875655 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 24 19:09:51.876142 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.876125 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-4d7hh\""
Apr 24 19:09:51.878776 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.878756 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 19:09:51.889734 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.889720 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5xljk"]
Apr 24 19:09:51.928335 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.928313 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ed520f7-8b69-424e-a4fc-e91657a114ee-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:51.928430 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.928360 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ed520f7-8b69-424e-a4fc-e91657a114ee-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:51.928430 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.928387 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ed520f7-8b69-424e-a4fc-e91657a114ee-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:51.928430 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:51.928418 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59qc6\" (UniqueName: \"kubernetes.io/projected/2ed520f7-8b69-424e-a4fc-e91657a114ee-kube-api-access-59qc6\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.029523 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.029498 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ed520f7-8b69-424e-a4fc-e91657a114ee-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.029636 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.029569 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ed520f7-8b69-424e-a4fc-e91657a114ee-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.029636 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.029611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ed520f7-8b69-424e-a4fc-e91657a114ee-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.029758 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.029654 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59qc6\" (UniqueName: \"kubernetes.io/projected/2ed520f7-8b69-424e-a4fc-e91657a114ee-kube-api-access-59qc6\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.029815 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:52.029774 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 24 19:09:52.029870 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:09:52.029823 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ed520f7-8b69-424e-a4fc-e91657a114ee-prometheus-operator-tls podName:2ed520f7-8b69-424e-a4fc-e91657a114ee nodeName:}" failed. No retries permitted until 2026-04-24 19:09:52.529809117 +0000 UTC m=+157.168928936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/2ed520f7-8b69-424e-a4fc-e91657a114ee-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-5xljk" (UID: "2ed520f7-8b69-424e-a4fc-e91657a114ee") : secret "prometheus-operator-tls" not found
Apr 24 19:09:52.030267 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.030241 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ed520f7-8b69-424e-a4fc-e91657a114ee-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.032004 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.031982 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ed520f7-8b69-424e-a4fc-e91657a114ee-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.038503 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.038485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59qc6\" (UniqueName: \"kubernetes.io/projected/2ed520f7-8b69-424e-a4fc-e91657a114ee-kube-api-access-59qc6\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.532394 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.532339 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ed520f7-8b69-424e-a4fc-e91657a114ee-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.534444 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.534426 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ed520f7-8b69-424e-a4fc-e91657a114ee-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5xljk\" (UID: \"2ed520f7-8b69-424e-a4fc-e91657a114ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.781299 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.781276 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk"
Apr 24 19:09:52.900623 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:52.900521 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5xljk"]
Apr 24 19:09:52.902949 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:52.902915 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed520f7_8b69_424e_a4fc_e91657a114ee.slice/crio-79b035796e39c466f584a34da8c8f27b60133b99cf4410960bb1ed2279299f61 WatchSource:0}: Error finding container 79b035796e39c466f584a34da8c8f27b60133b99cf4410960bb1ed2279299f61: Status 404 returned error can't find the container with id 79b035796e39c466f584a34da8c8f27b60133b99cf4410960bb1ed2279299f61
Apr 24 19:09:53.512002 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:53.511949 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk" event={"ID":"2ed520f7-8b69-424e-a4fc-e91657a114ee","Type":"ContainerStarted","Data":"79b035796e39c466f584a34da8c8f27b60133b99cf4410960bb1ed2279299f61"}
Apr 24 19:09:54.516088 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.516049 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk" event={"ID":"2ed520f7-8b69-424e-a4fc-e91657a114ee","Type":"ContainerStarted","Data":"bd88b0366ba7e3871dbe36713e4e956eb6fb22e978865033053ab9d005a3be60"}
Apr 24 19:09:54.516088 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.516089 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk" event={"ID":"2ed520f7-8b69-424e-a4fc-e91657a114ee","Type":"ContainerStarted","Data":"0600ef60437e6dcbb2dd39b3596f76de90604f8037561317d956ece3ce26d23a"}
Apr 24 19:09:54.647290 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.647257 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:09:54.649603 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.649579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e703cafc-bfc2-4649-968d-ef6e4318694a-cert\") pod \"ingress-canary-zxshv\" (UID: \"e703cafc-bfc2-4649-968d-ef6e4318694a\") " pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:09:54.708114 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.708093 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tkcsj\""
Apr 24 19:09:54.716504 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.716488 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zxshv"
Apr 24 19:09:54.747898 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.747874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm"
Apr 24 19:09:54.750024 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.749995 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fae1c6b-197d-49e4-a9eb-9b922eaa6f48-metrics-tls\") pod \"dns-default-jlkwm\" (UID: \"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48\") " pod="openshift-dns/dns-default-jlkwm"
Apr 24 19:09:54.831706 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.831649 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-5xljk" podStartSLOduration=2.738357364 podStartE2EDuration="3.831634476s" podCreationTimestamp="2026-04-24 19:09:51 +0000 UTC" firstStartedPulling="2026-04-24 19:09:52.904733258 +0000 UTC m=+157.543853076" lastFinishedPulling="2026-04-24 19:09:53.998010356 +0000 UTC m=+158.637130188" observedRunningTime="2026-04-24 19:09:54.567284112 +0000 UTC m=+159.206403952" watchObservedRunningTime="2026-04-24 19:09:54.831634476 +0000 UTC m=+159.470754308"
Apr 24 19:09:54.832008 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:54.831987 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zxshv"]
Apr 24 19:09:54.834249 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:54.834221 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode703cafc_bfc2_4649_968d_ef6e4318694a.slice/crio-7650175bb4e00f571512bdcdd10c830223423df565bb7357236ec6aa10e7b8fe WatchSource:0}: Error finding container 7650175bb4e00f571512bdcdd10c830223423df565bb7357236ec6aa10e7b8fe: Status 404 returned error can't find the container with id 7650175bb4e00f571512bdcdd10c830223423df565bb7357236ec6aa10e7b8fe
Apr 24 19:09:55.520010 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:55.519929 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zxshv" event={"ID":"e703cafc-bfc2-4649-968d-ef6e4318694a","Type":"ContainerStarted","Data":"7650175bb4e00f571512bdcdd10c830223423df565bb7357236ec6aa10e7b8fe"}
Apr 24 19:09:56.372498 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.372474 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd"
Apr 24 19:09:56.431135 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.431109 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9tscd"]
Apr 24 19:09:56.435533 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.435511 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.438843 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.438630 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 19:09:56.438843 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.438634 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xnczg\""
Apr 24 19:09:56.444004 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.441468 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 19:09:56.444137 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.444078 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 19:09:56.444137 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.444118 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"]
Apr 24 19:09:56.449022 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.449002 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.451340 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.451321 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 24 19:09:56.451481 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.451462 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-dlvtm\""
Apr 24 19:09:56.451676 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.451656 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 24 19:09:56.451745 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.451551 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 24 19:09:56.457558 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.457432 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"]
Apr 24 19:09:56.560485 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-wtmp\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-accelerators-collector-config\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560560 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmkfj\" (UniqueName: \"kubernetes.io/projected/53337fb5-281e-43b2-ac91-aa328517d13c-kube-api-access-jmkfj\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560586 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560627 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560651 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560683 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560750 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-tls\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560813 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53337fb5-281e-43b2-ac91-aa328517d13c-sys\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560837 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/53337fb5-281e-43b2-ac91-aa328517d13c-root\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.560906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560903 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4grtt\" (UniqueName: \"kubernetes.io/projected/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-api-access-4grtt\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.561282 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560934 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.561282 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.560992 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-textfile\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.561282 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.561032 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/53337fb5-281e-43b2-ac91-aa328517d13c-metrics-client-ca\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.661785 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.661710 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-wtmp\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.661785 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.661760 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-accelerators-collector-config\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662019 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.661792 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmkfj\" (UniqueName: \"kubernetes.io/projected/53337fb5-281e-43b2-ac91-aa328517d13c-kube-api-access-jmkfj\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662019 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.661818 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.662019 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.661860 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.662019 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.661886 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.662019 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.661907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-wtmp\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662019 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.661914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.662019 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.661997 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-tls\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662383 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.662383 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662092 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53337fb5-281e-43b2-ac91-aa328517d13c-sys\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662383 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662118 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/53337fb5-281e-43b2-ac91-aa328517d13c-root\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662383 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662178 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/53337fb5-281e-43b2-ac91-aa328517d13c-root\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662383 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662226 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53337fb5-281e-43b2-ac91-aa328517d13c-sys\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662437 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.662645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662467 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-accelerators-collector-config\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662494 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4grtt\" (UniqueName: \"kubernetes.io/projected/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-api-access-4grtt\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.662645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662561 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-textfile\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/53337fb5-281e-43b2-ac91-aa328517d13c-metrics-client-ca\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.662928 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662675 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.662928 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.662877 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.663381 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.663358 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-textfile\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.663616 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.663592 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/53337fb5-281e-43b2-ac91-aa328517d13c-metrics-client-ca\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.665369 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.665347 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.665369 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.665361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-tls\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.665502 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.665394 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/53337fb5-281e-43b2-ac91-aa328517d13c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.665502 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.665480 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.670231 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.670202 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmkfj\" (UniqueName: \"kubernetes.io/projected/53337fb5-281e-43b2-ac91-aa328517d13c-kube-api-access-jmkfj\") pod \"node-exporter-9tscd\" (UID: \"53337fb5-281e-43b2-ac91-aa328517d13c\") " pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.671316 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.671274 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4grtt\" (UniqueName: \"kubernetes.io/projected/cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa-kube-api-access-4grtt\") pod \"kube-state-metrics-69db897b98-vkb2m\" (UID: \"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.753328 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.753193 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9tscd"
Apr 24 19:09:56.762422 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.762401 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"
Apr 24 19:09:56.863993 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:56.863937 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53337fb5_281e_43b2_ac91_aa328517d13c.slice/crio-f97b205491b40814db196b45e35f5808e4be2d61e84d8b2f68d9f1bbd67f52cc WatchSource:0}: Error finding container f97b205491b40814db196b45e35f5808e4be2d61e84d8b2f68d9f1bbd67f52cc: Status 404 returned error can't find the container with id f97b205491b40814db196b45e35f5808e4be2d61e84d8b2f68d9f1bbd67f52cc
Apr 24 19:09:56.987447 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:56.987425 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-vkb2m"]
Apr 24 19:09:56.990263 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:09:56.990237 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdef82cb_4f2d_4d39_99ad_0c822bb4a6fa.slice/crio-bf26001d9b67f2673e1b847da70a9be6389e9ded05b8de52b5ae39962685aa63 WatchSource:0}: Error finding container bf26001d9b67f2673e1b847da70a9be6389e9ded05b8de52b5ae39962685aa63: Status 404 returned error can't find the container with id bf26001d9b67f2673e1b847da70a9be6389e9ded05b8de52b5ae39962685aa63
Apr 24 19:09:57.528142 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:57.528105 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m" event={"ID":"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa","Type":"ContainerStarted","Data":"bf26001d9b67f2673e1b847da70a9be6389e9ded05b8de52b5ae39962685aa63"}
Apr 24 19:09:57.529509 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:57.529439 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9tscd"
event={"ID":"53337fb5-281e-43b2-ac91-aa328517d13c","Type":"ContainerStarted","Data":"f97b205491b40814db196b45e35f5808e4be2d61e84d8b2f68d9f1bbd67f52cc"} Apr 24 19:09:57.533142 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:57.533108 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zxshv" event={"ID":"e703cafc-bfc2-4649-968d-ef6e4318694a","Type":"ContainerStarted","Data":"43a3c20f670b3fea72d584296bb9adf84ec1c9b54299ca14f09bc8ad556ef4d2"} Apr 24 19:09:57.554600 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:57.554519 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zxshv" podStartSLOduration=129.473090148 podStartE2EDuration="2m11.554503809s" podCreationTimestamp="2026-04-24 19:07:46 +0000 UTC" firstStartedPulling="2026-04-24 19:09:54.836024196 +0000 UTC m=+159.475144019" lastFinishedPulling="2026-04-24 19:09:56.917437854 +0000 UTC m=+161.556557680" observedRunningTime="2026-04-24 19:09:57.552541564 +0000 UTC m=+162.191661417" watchObservedRunningTime="2026-04-24 19:09:57.554503809 +0000 UTC m=+162.193623651" Apr 24 19:09:58.537919 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:58.537877 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m" event={"ID":"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa","Type":"ContainerStarted","Data":"f92c4b2f14daeaa42e425d6cc9b3daf293405d327266d6acd7dcf92528923433"} Apr 24 19:09:58.537919 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:58.537922 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m" event={"ID":"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa","Type":"ContainerStarted","Data":"40b36b6e4c8c9fa6e07d0fa93189d77bc5af4ba02bb19505ba2c7bb9f2fc55aa"} Apr 24 19:09:58.538432 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:58.537936 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m" event={"ID":"cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa","Type":"ContainerStarted","Data":"5432688cd51decbae911c50af7264f881dabeb8a822f70f597761af27ff6e8cd"} Apr 24 19:09:58.539411 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:58.539388 2568 generic.go:358] "Generic (PLEG): container finished" podID="53337fb5-281e-43b2-ac91-aa328517d13c" containerID="fcaea7419d33e12a8083c85cd1f8b5f2d971bcb76d2f50e8ee12394d71cabaea" exitCode=0 Apr 24 19:09:58.539534 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:58.539476 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9tscd" event={"ID":"53337fb5-281e-43b2-ac91-aa328517d13c","Type":"ContainerDied","Data":"fcaea7419d33e12a8083c85cd1f8b5f2d971bcb76d2f50e8ee12394d71cabaea"} Apr 24 19:09:58.569391 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:58.569338 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-vkb2m" podStartSLOduration=1.420701241 podStartE2EDuration="2.569323723s" podCreationTimestamp="2026-04-24 19:09:56 +0000 UTC" firstStartedPulling="2026-04-24 19:09:56.992638159 +0000 UTC m=+161.631757987" lastFinishedPulling="2026-04-24 19:09:58.141260627 +0000 UTC m=+162.780380469" observedRunningTime="2026-04-24 19:09:58.568294089 +0000 UTC m=+163.207413929" watchObservedRunningTime="2026-04-24 19:09:58.569323723 +0000 UTC m=+163.208443590" Apr 24 19:09:59.544220 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:59.544179 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9tscd" event={"ID":"53337fb5-281e-43b2-ac91-aa328517d13c","Type":"ContainerStarted","Data":"a90da15926e1ce4e3525378994e93524574c6ebbc9cfd647828d1766f7a43811"} Apr 24 19:09:59.544596 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:59.544228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9tscd" 
event={"ID":"53337fb5-281e-43b2-ac91-aa328517d13c","Type":"ContainerStarted","Data":"9678114326f88f8adcc293a3a8d4921523420224752ba351c2df548168918ae9"} Apr 24 19:09:59.575139 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:09:59.575082 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9tscd" podStartSLOduration=2.7580950619999998 podStartE2EDuration="3.575071209s" podCreationTimestamp="2026-04-24 19:09:56 +0000 UTC" firstStartedPulling="2026-04-24 19:09:56.865908839 +0000 UTC m=+161.505028661" lastFinishedPulling="2026-04-24 19:09:57.682884983 +0000 UTC m=+162.322004808" observedRunningTime="2026-04-24 19:09:59.574189047 +0000 UTC m=+164.213308903" watchObservedRunningTime="2026-04-24 19:09:59.575071209 +0000 UTC m=+164.214191050" Apr 24 19:10:02.689428 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.689397 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:10:02.693551 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.693530 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.696294 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.696263 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 19:10:02.696733 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.696709 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 19:10:02.696865 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.696848 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 19:10:02.697561 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.697540 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 19:10:02.697675 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.697552 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 19:10:02.697675 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.697605 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-79lp5\"" Apr 24 19:10:02.697915 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.697895 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 19:10:02.698379 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.698232 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 19:10:02.698379 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.698266 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 19:10:02.701162 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.701127 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6gd5cqnsfnrd4\"" Apr 24 19:10:02.701366 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.701350 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 19:10:02.701499 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.701360 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 19:10:02.701499 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.701432 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 19:10:02.709944 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.709884 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 19:10:02.711156 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.711135 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 19:10:02.721047 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.721009 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:10:02.816581 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k46fc\" (UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-kube-api-access-k46fc\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
19:10:02.816581 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816581 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.816758 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.816758 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816616 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.816758 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816666 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.816758 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.816758 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816735 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817050 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816761 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817050 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816834 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817050 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816867 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817050 
ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816910 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817050 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.816986 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-config-out\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817050 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.817008 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817050 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.817040 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-config\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817050 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.817053 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817318 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.817070 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817318 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.817105 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-web-config\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.817318 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.817121 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.917841 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.917806 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.917841 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.917841 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-config\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.918064 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.917861 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.918064 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.917888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.918064 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.917936 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-web-config\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.918064 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.917985 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.918064 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.918011 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k46fc\" 
(UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-kube-api-access-k46fc\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.918064 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.918040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.918064 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.918065 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.918420 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.918090 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.918420 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.918111 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.919385 ip-10-0-129-23 kubenswrapper[2568]: I0424 
19:10:02.919360 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.919658 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.919634 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.919739 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.919692 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.919739 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.919728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.919854 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.919768 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.919854 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.919805 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.919854 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.919841 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.920034 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.919865 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.920034 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.919898 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-config-out\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.920680 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.920660 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.921093 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.920983 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.921385 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.921358 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-config\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.922443 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.922108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.923568 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.923541 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-config-out\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.923807 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.923785 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.924567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.924543 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.925275 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.924899 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.925275 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.925162 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.925275 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.925216 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.925275 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.925235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.925510 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.925325 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.926051 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.925945 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.926348 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.926324 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-web-config\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.927170 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.927153 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.927393 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.927372 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k46fc\" (UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-kube-api-access-k46fc\") pod \"prometheus-k8s-0\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:02.979333 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.979222 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jlkwm" Apr 24 19:10:02.982434 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.982419 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d7j6w\"" Apr 24 19:10:02.990119 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:02.990097 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jlkwm" Apr 24 19:10:03.009682 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:03.009258 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:03.134874 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:03.134830 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jlkwm"] Apr 24 19:10:03.139348 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:10:03.139317 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fae1c6b_197d_49e4_a9eb_9b922eaa6f48.slice/crio-8da3b13d090bcaaedff7dbd643a503328b1d15ba8fa07dfe5192ddd20bb75b8a WatchSource:0}: Error finding container 8da3b13d090bcaaedff7dbd643a503328b1d15ba8fa07dfe5192ddd20bb75b8a: Status 404 returned error can't find the container with id 8da3b13d090bcaaedff7dbd643a503328b1d15ba8fa07dfe5192ddd20bb75b8a Apr 24 19:10:03.168716 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:03.168689 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:10:03.172105 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:10:03.172083 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf596b606_4787_4bf8_ab56_54191d7a1ecb.slice/crio-37adf6cebaf54604b3049de4b47b9e750505545f199518c7d30663ef3eb504d0 WatchSource:0}: Error finding container 37adf6cebaf54604b3049de4b47b9e750505545f199518c7d30663ef3eb504d0: Status 404 returned error can't find the container with id 37adf6cebaf54604b3049de4b47b9e750505545f199518c7d30663ef3eb504d0 Apr 24 19:10:03.556553 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:03.556517 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jlkwm" event={"ID":"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48","Type":"ContainerStarted","Data":"8da3b13d090bcaaedff7dbd643a503328b1d15ba8fa07dfe5192ddd20bb75b8a"} Apr 24 19:10:03.557693 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:03.557653 2568 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerStarted","Data":"37adf6cebaf54604b3049de4b47b9e750505545f199518c7d30663ef3eb504d0"} Apr 24 19:10:04.561679 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:04.561648 2568 generic.go:358] "Generic (PLEG): container finished" podID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerID="9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8" exitCode=0 Apr 24 19:10:04.562050 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:04.561725 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerDied","Data":"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"} Apr 24 19:10:04.978865 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:04.978837 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:10:05.566141 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:05.566104 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jlkwm" event={"ID":"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48","Type":"ContainerStarted","Data":"3bdfd494690f14c4cde5789a9a9e09551dbbb45556c200cee2ef9c8a7ff224f1"} Apr 24 19:10:05.566141 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:05.566145 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jlkwm" event={"ID":"7fae1c6b-197d-49e4-a9eb-9b922eaa6f48","Type":"ContainerStarted","Data":"8d260fc33176f80afa523a19c6fb85a953ce47b7ca68a28c2cfeff8243725902"} Apr 24 19:10:05.566636 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:05.566293 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jlkwm" Apr 24 19:10:05.613136 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:05.613084 2568 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-dns/dns-default-jlkwm" podStartSLOduration=138.15912696 podStartE2EDuration="2m19.613069348s" podCreationTimestamp="2026-04-24 19:07:46 +0000 UTC" firstStartedPulling="2026-04-24 19:10:03.142868953 +0000 UTC m=+167.781988775" lastFinishedPulling="2026-04-24 19:10:04.596811341 +0000 UTC m=+169.235931163" observedRunningTime="2026-04-24 19:10:05.612576035 +0000 UTC m=+170.251695875" watchObservedRunningTime="2026-04-24 19:10:05.613069348 +0000 UTC m=+170.252189189" Apr 24 19:10:07.574185 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:07.574142 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerStarted","Data":"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"} Apr 24 19:10:07.574185 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:07.574189 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerStarted","Data":"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"} Apr 24 19:10:08.501898 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:08.501865 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6686b677f9-mncpn" Apr 24 19:10:09.583598 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:09.583573 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerStarted","Data":"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"} Apr 24 19:10:09.583944 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:09.583607 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerStarted","Data":"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"} Apr 24 19:10:09.583944 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:09.583617 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerStarted","Data":"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"} Apr 24 19:10:09.583944 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:09.583625 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerStarted","Data":"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"} Apr 24 19:10:09.623531 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:09.623490 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.831993009 podStartE2EDuration="7.623475761s" podCreationTimestamp="2026-04-24 19:10:02 +0000 UTC" firstStartedPulling="2026-04-24 19:10:03.174148837 +0000 UTC m=+167.813268662" lastFinishedPulling="2026-04-24 19:10:08.965631593 +0000 UTC m=+173.604751414" observedRunningTime="2026-04-24 19:10:09.620835457 +0000 UTC m=+174.259955328" watchObservedRunningTime="2026-04-24 19:10:09.623475761 +0000 UTC m=+174.262595601" Apr 24 19:10:11.384693 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.384629 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" podUID="1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" containerName="registry" containerID="cri-o://345c97acbb5354834fb43a21f7d973351c03b4a6d1ce2d13f6c6321e92dd9014" gracePeriod=30 Apr 24 19:10:11.589506 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.589473 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" containerID="345c97acbb5354834fb43a21f7d973351c03b4a6d1ce2d13f6c6321e92dd9014" exitCode=0 Apr 24 19:10:11.589624 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.589546 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" event={"ID":"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd","Type":"ContainerDied","Data":"345c97acbb5354834fb43a21f7d973351c03b4a6d1ce2d13f6c6321e92dd9014"} Apr 24 19:10:11.612002 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.611981 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:10:11.701709 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.701644 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-ca-trust-extracted\") pod \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " Apr 24 19:10:11.701709 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.701681 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-installation-pull-secrets\") pod \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " Apr 24 19:10:11.701709 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.701703 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-certificates\") pod \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " Apr 24 19:10:11.701930 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.701721 2568 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-image-registry-private-configuration\") pod \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " Apr 24 19:10:11.701930 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.701766 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-bound-sa-token\") pod \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " Apr 24 19:10:11.701930 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.701888 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66x9q\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-kube-api-access-66x9q\") pod \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " Apr 24 19:10:11.701930 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.701925 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-trusted-ca\") pod \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " Apr 24 19:10:11.702168 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.702150 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") pod \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\" (UID: \"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd\") " Apr 24 19:10:11.702545 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.702476 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:10:11.703169 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.703144 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:10:11.704485 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.704445 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:10:11.704485 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.704457 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-kube-api-access-66x9q" (OuterVolumeSpecName: "kube-api-access-66x9q") pod "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd"). InnerVolumeSpecName "kube-api-access-66x9q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:10:11.704655 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.704646 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:10:11.704906 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.704878 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:10:11.705125 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.705100 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:10:11.710292 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.710266 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" (UID: "1b4671a4-a2bd-4f33-b18e-eead8bfe79fd"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:10:11.803713 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.803687 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-tls\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:10:11.803713 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.803711 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-ca-trust-extracted\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:10:11.803827 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.803721 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-installation-pull-secrets\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:10:11.803827 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.803731 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-registry-certificates\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:10:11.803827 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.803741 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-image-registry-private-configuration\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:10:11.803827 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.803751 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-bound-sa-token\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 
24 19:10:11.803827 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.803759 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-66x9q\" (UniqueName: \"kubernetes.io/projected/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-kube-api-access-66x9q\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:10:11.803827 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:11.803782 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd-trusted-ca\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:10:12.593060 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:12.593030 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" event={"ID":"1b4671a4-a2bd-4f33-b18e-eead8bfe79fd","Type":"ContainerDied","Data":"f6c294dd43497a128dee0b466d55432ff87cc790b33ca9dc231e24039980596c"} Apr 24 19:10:12.593060 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:12.593062 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55d5479bb4-7cwnd" Apr 24 19:10:12.593505 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:12.593071 2568 scope.go:117] "RemoveContainer" containerID="345c97acbb5354834fb43a21f7d973351c03b4a6d1ce2d13f6c6321e92dd9014" Apr 24 19:10:12.616669 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:12.616645 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55d5479bb4-7cwnd"] Apr 24 19:10:12.619383 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:12.619362 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55d5479bb4-7cwnd"] Apr 24 19:10:13.010551 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:13.010473 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:13.982922 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:13.982889 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" path="/var/lib/kubelet/pods/1b4671a4-a2bd-4f33-b18e-eead8bfe79fd/volumes" Apr 24 19:10:15.571173 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:15.571142 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jlkwm" Apr 24 19:10:34.653539 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:34.653509 2568 generic.go:358] "Generic (PLEG): container finished" podID="c4d6f584-a9a2-4297-9d42-6683202fc40f" containerID="e0903b8081a2b1b3cd261404f5c8041d3ed564103de60a044cdc24051fe8736f" exitCode=0 Apr 24 19:10:34.653923 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:34.653586 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh" 
event={"ID":"c4d6f584-a9a2-4297-9d42-6683202fc40f","Type":"ContainerDied","Data":"e0903b8081a2b1b3cd261404f5c8041d3ed564103de60a044cdc24051fe8736f"} Apr 24 19:10:34.653923 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:34.653917 2568 scope.go:117] "RemoveContainer" containerID="e0903b8081a2b1b3cd261404f5c8041d3ed564103de60a044cdc24051fe8736f" Apr 24 19:10:34.655203 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:34.655179 2568 generic.go:358] "Generic (PLEG): container finished" podID="44f2135d-60e2-4ac2-9dc5-8f2de4ca429c" containerID="fcff2afec9cf768f43e5eab9747347b9bd7b938c94375b829ace9ca9d74ac876" exitCode=0 Apr 24 19:10:34.655309 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:34.655233 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" event={"ID":"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c","Type":"ContainerDied","Data":"fcff2afec9cf768f43e5eab9747347b9bd7b938c94375b829ace9ca9d74ac876"} Apr 24 19:10:34.655500 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:34.655486 2568 scope.go:117] "RemoveContainer" containerID="fcff2afec9cf768f43e5eab9747347b9bd7b938c94375b829ace9ca9d74ac876" Apr 24 19:10:35.659175 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:35.659135 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jqb5j" event={"ID":"44f2135d-60e2-4ac2-9dc5-8f2de4ca429c","Type":"ContainerStarted","Data":"6d51f554aa2010ac2a72e132f54e399e89ae20138259d479cbc8f8e1f0146a22"} Apr 24 19:10:35.660644 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:35.660620 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxjdh" event={"ID":"c4d6f584-a9a2-4297-9d42-6683202fc40f","Type":"ContainerStarted","Data":"a4a282c8707955692f54bc5e7809ed8229b0e293dc4d9524ea44b6ff8ffdfc32"} Apr 24 19:10:43.683780 ip-10-0-129-23 kubenswrapper[2568]: I0424 
19:10:43.683751 2568 generic.go:358] "Generic (PLEG): container finished" podID="fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa" containerID="36f791330fb1feb7690f22befb3c15aa9f5bbc84b3d326f221fdb07ebaebab27" exitCode=0 Apr 24 19:10:43.684142 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:43.683809 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d9pbk" event={"ID":"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa","Type":"ContainerDied","Data":"36f791330fb1feb7690f22befb3c15aa9f5bbc84b3d326f221fdb07ebaebab27"} Apr 24 19:10:43.684142 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:43.684100 2568 scope.go:117] "RemoveContainer" containerID="36f791330fb1feb7690f22befb3c15aa9f5bbc84b3d326f221fdb07ebaebab27" Apr 24 19:10:44.687788 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:10:44.687754 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d9pbk" event={"ID":"fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa","Type":"ContainerStarted","Data":"b9bfac8eb1e9441b19f516bf32d5aa877f69409ad5a8197ea3ebfdd364d9f7e6"} Apr 24 19:11:03.010808 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:03.010774 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:03.029717 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:03.029690 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:03.760186 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:03.760161 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:21.107353 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.107321 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:11:21.107884 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.107834 2568 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy-thanos" containerID="cri-o://5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031" gracePeriod=600 Apr 24 19:11:21.107990 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.107881 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="config-reloader" containerID="cri-o://ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f" gracePeriod=600 Apr 24 19:11:21.107990 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.107883 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy" containerID="cri-o://d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f" gracePeriod=600 Apr 24 19:11:21.107990 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.107845 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy-web" containerID="cri-o://83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f" gracePeriod=600 Apr 24 19:11:21.107990 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.107833 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="thanos-sidecar" containerID="cri-o://852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50" gracePeriod=600 Apr 24 19:11:21.108192 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.108005 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="prometheus" containerID="cri-o://e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84" gracePeriod=600 Apr 24 19:11:21.349303 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.349023 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:21.424019 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.423917 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-web-config\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424019 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.423974 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-tls-assets\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424231 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424027 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-rulefiles-0\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424231 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424051 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-config-out\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424231 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424089 2568 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-metrics-client-certs\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424231 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424113 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-metrics-client-ca\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424231 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424136 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-thanos-prometheus-http-client-file\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424231 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424168 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k46fc\" (UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-kube-api-access-k46fc\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424231 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424208 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424237 2568 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-db\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424268 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-config\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424298 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-tls\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424332 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-grpc-tls\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424392 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-kubelet-serving-ca-bundle\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.424567 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424537 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:11:21.424869 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.424711 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:11:21.426046 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.425297 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-trusted-ca-bundle\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.426046 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.425351 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-kube-rbac-proxy\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.426046 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.425400 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-serving-certs-ca-bundle\") pod 
\"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.426046 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.425433 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f596b606-4787-4bf8-ab56-54191d7a1ecb\" (UID: \"f596b606-4787-4bf8-ab56-54191d7a1ecb\") " Apr 24 19:11:21.426046 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.425705 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.426046 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.425728 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-metrics-client-ca\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.426046 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.425902 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:11:21.426510 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.426481 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:11:21.427366 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.427333 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:11:21.427767 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.427738 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:11:21.427856 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.427826 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:11:21.428579 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.428539 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-config-out" (OuterVolumeSpecName: "config-out") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:11:21.428579 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.428543 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-config" (OuterVolumeSpecName: "config") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:21.428579 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.428570 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:21.428800 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.428594 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:21.429049 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.429012 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:21.429533 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.429510 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:21.429921 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.429901 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:21.430368 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.430341 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:21.430703 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.430660 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:21.430788 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.430699 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-kube-api-access-k46fc" (OuterVolumeSpecName: "kube-api-access-k46fc") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "kube-api-access-k46fc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:11:21.438122 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.438084 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-web-config" (OuterVolumeSpecName: "web-config") pod "f596b606-4787-4bf8-ab56-54191d7a1ecb" (UID: "f596b606-4787-4bf8-ab56-54191d7a1ecb"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:21.526937 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.526916 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.526937 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.526938 2568 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-grpc-tls\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.526948 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.526979 2568 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-kube-rbac-proxy\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.526988 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.526998 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527007 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-web-config\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527016 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-tls-assets\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527025 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527033 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-config-out\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527041 2568 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-metrics-client-certs\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527051 2568 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527059 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k46fc\" (UniqueName: \"kubernetes.io/projected/f596b606-4787-4bf8-ab56-54191d7a1ecb-kube-api-access-k46fc\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527068 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527074 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527077 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f596b606-4787-4bf8-ab56-54191d7a1ecb-prometheus-k8s-db\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.527469 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.527086 2568 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f596b606-4787-4bf8-ab56-54191d7a1ecb-config\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:11:21.793169 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793092 2568 generic.go:358] "Generic (PLEG): container finished" podID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerID="5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031" exitCode=0 Apr 24 19:11:21.793169 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793113 2568 generic.go:358] "Generic (PLEG): container finished" podID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerID="d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f" exitCode=0 Apr 
24 19:11:21.793169 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793119 2568 generic.go:358] "Generic (PLEG): container finished" podID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerID="83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f" exitCode=0 Apr 24 19:11:21.793169 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793125 2568 generic.go:358] "Generic (PLEG): container finished" podID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerID="852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50" exitCode=0 Apr 24 19:11:21.793169 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793130 2568 generic.go:358] "Generic (PLEG): container finished" podID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerID="ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f" exitCode=0 Apr 24 19:11:21.793169 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793135 2568 generic.go:358] "Generic (PLEG): container finished" podID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerID="e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84" exitCode=0 Apr 24 19:11:21.793169 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793159 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerDied","Data":"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"} Apr 24 19:11:21.793570 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793184 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerDied","Data":"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"} Apr 24 19:11:21.793570 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793195 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerDied","Data":"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"} Apr 24 19:11:21.793570 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793204 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerDied","Data":"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"} Apr 24 19:11:21.793570 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793213 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerDied","Data":"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"} Apr 24 19:11:21.793570 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793221 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerDied","Data":"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"} Apr 24 19:11:21.793570 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793224 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:21.793570 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793237 2568 scope.go:117] "RemoveContainer" containerID="5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031" Apr 24 19:11:21.793570 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.793228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f596b606-4787-4bf8-ab56-54191d7a1ecb","Type":"ContainerDied","Data":"37adf6cebaf54604b3049de4b47b9e750505545f199518c7d30663ef3eb504d0"} Apr 24 19:11:21.802213 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.802184 2568 scope.go:117] "RemoveContainer" containerID="d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f" Apr 24 19:11:21.808394 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.808378 2568 scope.go:117] "RemoveContainer" containerID="83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f" Apr 24 19:11:21.814429 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.814410 2568 scope.go:117] "RemoveContainer" containerID="852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50" Apr 24 19:11:21.818415 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.818394 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:11:21.820917 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.820884 2568 scope.go:117] "RemoveContainer" containerID="ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f" Apr 24 19:11:21.822146 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.822122 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:11:21.826843 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.826825 2568 scope.go:117] "RemoveContainer" containerID="e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84" Apr 24 19:11:21.833093 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:11:21.833078 2568 scope.go:117] "RemoveContainer" containerID="9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8" Apr 24 19:11:21.838894 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.838873 2568 scope.go:117] "RemoveContainer" containerID="5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031" Apr 24 19:11:21.839187 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:11:21.839158 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": container with ID starting with 5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031 not found: ID does not exist" containerID="5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031" Apr 24 19:11:21.839282 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.839190 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"} err="failed to get container status \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": rpc error: code = NotFound desc = could not find container \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": container with ID starting with 5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031 not found: ID does not exist" Apr 24 19:11:21.839282 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.839230 2568 scope.go:117] "RemoveContainer" containerID="d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f" Apr 24 19:11:21.839479 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:11:21.839460 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": container with ID starting with 
d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f not found: ID does not exist" containerID="d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f" Apr 24 19:11:21.839533 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.839485 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"} err="failed to get container status \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": rpc error: code = NotFound desc = could not find container \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": container with ID starting with d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f not found: ID does not exist" Apr 24 19:11:21.839533 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.839500 2568 scope.go:117] "RemoveContainer" containerID="83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f" Apr 24 19:11:21.839725 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:11:21.839708 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": container with ID starting with 83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f not found: ID does not exist" containerID="83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f" Apr 24 19:11:21.839771 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.839731 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"} err="failed to get container status \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": rpc error: code = NotFound desc = could not find container \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": container with ID starting with 
83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f not found: ID does not exist" Apr 24 19:11:21.839771 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.839749 2568 scope.go:117] "RemoveContainer" containerID="852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50" Apr 24 19:11:21.839971 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:11:21.839938 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": container with ID starting with 852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50 not found: ID does not exist" containerID="852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50" Apr 24 19:11:21.840009 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.839972 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"} err="failed to get container status \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": rpc error: code = NotFound desc = could not find container \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": container with ID starting with 852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50 not found: ID does not exist" Apr 24 19:11:21.840009 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.839986 2568 scope.go:117] "RemoveContainer" containerID="ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f" Apr 24 19:11:21.840199 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:11:21.840180 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": container with ID starting with ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f not found: ID does not exist" 
containerID="ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f" Apr 24 19:11:21.840236 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.840204 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"} err="failed to get container status \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": rpc error: code = NotFound desc = could not find container \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": container with ID starting with ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f not found: ID does not exist" Apr 24 19:11:21.840236 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.840219 2568 scope.go:117] "RemoveContainer" containerID="e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84" Apr 24 19:11:21.840442 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:11:21.840426 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": container with ID starting with e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84 not found: ID does not exist" containerID="e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84" Apr 24 19:11:21.840498 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.840449 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"} err="failed to get container status \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": rpc error: code = NotFound desc = could not find container \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": container with ID starting with e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84 not found: ID does not exist" Apr 24 
19:11:21.840498 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.840470 2568 scope.go:117] "RemoveContainer" containerID="9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"
Apr 24 19:11:21.840704 ip-10-0-129-23 kubenswrapper[2568]: E0424 19:11:21.840683 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": container with ID starting with 9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8 not found: ID does not exist" containerID="9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"
Apr 24 19:11:21.840749 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.840708 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"} err="failed to get container status \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": rpc error: code = NotFound desc = could not find container \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": container with ID starting with 9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8 not found: ID does not exist"
Apr 24 19:11:21.840749 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.840721 2568 scope.go:117] "RemoveContainer" containerID="5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"
Apr 24 19:11:21.840926 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.840906 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"} err="failed to get container status \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": rpc error: code = NotFound desc = could not find container \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": container with ID starting with 5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031 not found: ID does not exist"
Apr 24 19:11:21.840988 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.840928 2568 scope.go:117] "RemoveContainer" containerID="d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"
Apr 24 19:11:21.841150 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841134 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"} err="failed to get container status \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": rpc error: code = NotFound desc = could not find container \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": container with ID starting with d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f not found: ID does not exist"
Apr 24 19:11:21.841190 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841151 2568 scope.go:117] "RemoveContainer" containerID="83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"
Apr 24 19:11:21.841334 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841317 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"} err="failed to get container status \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": rpc error: code = NotFound desc = could not find container \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": container with ID starting with 83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f not found: ID does not exist"
Apr 24 19:11:21.841378 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841337 2568 scope.go:117] "RemoveContainer" containerID="852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"
Apr 24 19:11:21.841539 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841521 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"} err="failed to get container status \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": rpc error: code = NotFound desc = could not find container \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": container with ID starting with 852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50 not found: ID does not exist"
Apr 24 19:11:21.841608 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841541 2568 scope.go:117] "RemoveContainer" containerID="ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"
Apr 24 19:11:21.841746 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841730 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"} err="failed to get container status \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": rpc error: code = NotFound desc = could not find container \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": container with ID starting with ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f not found: ID does not exist"
Apr 24 19:11:21.841791 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841746 2568 scope.go:117] "RemoveContainer" containerID="e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"
Apr 24 19:11:21.841934 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841915 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"} err="failed to get container status \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": rpc error: code = NotFound desc = could not find container \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": container with ID starting with e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84 not found: ID does not exist"
Apr 24 19:11:21.842000 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.841935 2568 scope.go:117] "RemoveContainer" containerID="9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"
Apr 24 19:11:21.842134 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842115 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"} err="failed to get container status \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": rpc error: code = NotFound desc = could not find container \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": container with ID starting with 9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8 not found: ID does not exist"
Apr 24 19:11:21.842196 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842135 2568 scope.go:117] "RemoveContainer" containerID="5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"
Apr 24 19:11:21.842335 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842321 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"} err="failed to get container status \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": rpc error: code = NotFound desc = could not find container \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": container with ID starting with 5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031 not found: ID does not exist"
Apr 24 19:11:21.842378 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842335 2568 scope.go:117] "RemoveContainer" containerID="d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"
Apr 24 19:11:21.842530 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842507 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"} err="failed to get container status \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": rpc error: code = NotFound desc = could not find container \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": container with ID starting with d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f not found: ID does not exist"
Apr 24 19:11:21.842577 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842533 2568 scope.go:117] "RemoveContainer" containerID="83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"
Apr 24 19:11:21.842695 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842681 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"} err="failed to get container status \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": rpc error: code = NotFound desc = could not find container \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": container with ID starting with 83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f not found: ID does not exist"
Apr 24 19:11:21.842734 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842695 2568 scope.go:117] "RemoveContainer" containerID="852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"
Apr 24 19:11:21.842881 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842861 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"} err="failed to get container status \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": rpc error: code = NotFound desc = could not find container \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": container with ID starting with 852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50 not found: ID does not exist"
Apr 24 19:11:21.842917 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.842883 2568 scope.go:117] "RemoveContainer" containerID="ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"
Apr 24 19:11:21.843091 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843076 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"} err="failed to get container status \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": rpc error: code = NotFound desc = could not find container \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": container with ID starting with ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f not found: ID does not exist"
Apr 24 19:11:21.843136 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843092 2568 scope.go:117] "RemoveContainer" containerID="e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"
Apr 24 19:11:21.843244 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843231 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"} err="failed to get container status \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": rpc error: code = NotFound desc = could not find container \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": container with ID starting with e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84 not found: ID does not exist"
Apr 24 19:11:21.843285 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843244 2568 scope.go:117] "RemoveContainer" containerID="9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"
Apr 24 19:11:21.843429 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843414 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"} err="failed to get container status \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": rpc error: code = NotFound desc = could not find container \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": container with ID starting with 9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8 not found: ID does not exist"
Apr 24 19:11:21.843477 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843429 2568 scope.go:117] "RemoveContainer" containerID="5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"
Apr 24 19:11:21.843603 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843589 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"} err="failed to get container status \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": rpc error: code = NotFound desc = could not find container \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": container with ID starting with 5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031 not found: ID does not exist"
Apr 24 19:11:21.843603 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843602 2568 scope.go:117] "RemoveContainer" containerID="d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"
Apr 24 19:11:21.843781 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843763 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"} err="failed to get container status \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": rpc error: code = NotFound desc = could not find container \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": container with ID starting with d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f not found: ID does not exist"
Apr 24 19:11:21.843843 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843783 2568 scope.go:117] "RemoveContainer" containerID="83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"
Apr 24 19:11:21.843992 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843978 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"} err="failed to get container status \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": rpc error: code = NotFound desc = could not find container \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": container with ID starting with 83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f not found: ID does not exist"
Apr 24 19:11:21.844043 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.843993 2568 scope.go:117] "RemoveContainer" containerID="852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"
Apr 24 19:11:21.844182 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844167 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"} err="failed to get container status \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": rpc error: code = NotFound desc = could not find container \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": container with ID starting with 852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50 not found: ID does not exist"
Apr 24 19:11:21.844230 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844182 2568 scope.go:117] "RemoveContainer" containerID="ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"
Apr 24 19:11:21.844347 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844334 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"} err="failed to get container status \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": rpc error: code = NotFound desc = could not find container \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": container with ID starting with ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f not found: ID does not exist"
Apr 24 19:11:21.844389 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844347 2568 scope.go:117] "RemoveContainer" containerID="e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"
Apr 24 19:11:21.844527 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844512 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"} err="failed to get container status \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": rpc error: code = NotFound desc = could not find container \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": container with ID starting with e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84 not found: ID does not exist"
Apr 24 19:11:21.844569 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844528 2568 scope.go:117] "RemoveContainer" containerID="9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"
Apr 24 19:11:21.844729 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844701 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"} err="failed to get container status \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": rpc error: code = NotFound desc = could not find container \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": container with ID starting with 9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8 not found: ID does not exist"
Apr 24 19:11:21.844797 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844730 2568 scope.go:117] "RemoveContainer" containerID="5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"
Apr 24 19:11:21.844940 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844923 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"} err="failed to get container status \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": rpc error: code = NotFound desc = could not find container \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": container with ID starting with 5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031 not found: ID does not exist"
Apr 24 19:11:21.845014 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.844943 2568 scope.go:117] "RemoveContainer" containerID="d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"
Apr 24 19:11:21.845217 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.845199 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"} err="failed to get container status \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": rpc error: code = NotFound desc = could not find container \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": container with ID starting with d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f not found: ID does not exist"
Apr 24 19:11:21.845261 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.845217 2568 scope.go:117] "RemoveContainer" containerID="83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"
Apr 24 19:11:21.845397 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.845378 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"} err="failed to get container status \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": rpc error: code = NotFound desc = could not find container \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": container with ID starting with 83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f not found: ID does not exist"
Apr 24 19:11:21.845397 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.845396 2568 scope.go:117] "RemoveContainer" containerID="852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"
Apr 24 19:11:21.845582 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.845568 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"} err="failed to get container status \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": rpc error: code = NotFound desc = could not find container \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": container with ID starting with 852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50 not found: ID does not exist"
Apr 24 19:11:21.845630 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.845583 2568 scope.go:117] "RemoveContainer" containerID="ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"
Apr 24 19:11:21.845789 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.845769 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"} err="failed to get container status \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": rpc error: code = NotFound desc = could not find container \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": container with ID starting with ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f not found: ID does not exist"
Apr 24 19:11:21.845789 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.845788 2568 scope.go:117] "RemoveContainer" containerID="e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"
Apr 24 19:11:21.846006 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.845986 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"} err="failed to get container status \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": rpc error: code = NotFound desc = could not find container \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": container with ID starting with e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84 not found: ID does not exist"
Apr 24 19:11:21.846080 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.846007 2568 scope.go:117] "RemoveContainer" containerID="9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"
Apr 24 19:11:21.846298 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.846277 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"} err="failed to get container status \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": rpc error: code = NotFound desc = could not find container \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": container with ID starting with 9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8 not found: ID does not exist"
Apr 24 19:11:21.846298 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.846297 2568 scope.go:117] "RemoveContainer" containerID="5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"
Apr 24 19:11:21.846620 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.846580 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031"} err="failed to get container status \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": rpc error: code = NotFound desc = could not find container \"5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031\": container with ID starting with 5c6668d566074dca25b3e4ecf27895db47faee228e17f1749c01326344c37031 not found: ID does not exist"
Apr 24 19:11:21.846620 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.846619 2568 scope.go:117] "RemoveContainer" containerID="d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"
Apr 24 19:11:21.846897 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.846841 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f"} err="failed to get container status \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": rpc error: code = NotFound desc = could not find container \"d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f\": container with ID starting with d82d7c3bd8d2e8e63c5b682df02d2ac233f2bf6ef5fcf84e62dc37b29904ea3f not found: ID does not exist"
Apr 24 19:11:21.846897 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.846881 2568 scope.go:117] "RemoveContainer" containerID="83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"
Apr 24 19:11:21.847317 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.847294 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f"} err="failed to get container status \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": rpc error: code = NotFound desc = could not find container \"83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f\": container with ID starting with 83f0acd7d9c1bdd4372f58c8f762bff87b6b97c4549a72bc202fb09538e8bb4f not found: ID does not exist"
Apr 24 19:11:21.847406 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.847319 2568 scope.go:117] "RemoveContainer" containerID="852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"
Apr 24 19:11:21.847629 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.847602 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50"} err="failed to get container status \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": rpc error: code = NotFound desc = could not find container \"852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50\": container with ID starting with 852cc2fcf4883496d036afa84d9dc1d8760290841ac63dd0bc78e6aaa8a5cf50 not found: ID does not exist"
Apr 24 19:11:21.847688 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.847632 2568 scope.go:117] "RemoveContainer" containerID="ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"
Apr 24 19:11:21.847809 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.847794 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 19:11:21.847878 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.847858 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f"} err="failed to get container status \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": rpc error: code = NotFound desc = could not find container \"ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f\": container with ID starting with ab8f1a4acdadfdc1a0f98717b8e0510e3b743697a7eb915bd838e814695bd49f not found: ID does not exist"
Apr 24 19:11:21.847923 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.847880 2568 scope.go:117] "RemoveContainer" containerID="e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"
Apr 24 19:11:21.848083 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848063 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" containerName="registry"
Apr 24 19:11:21.848083 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848083 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" containerName="registry"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848091 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848098 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848110 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="init-config-reloader"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848116 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="init-config-reloader"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848125 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="config-reloader"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848130 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="config-reloader"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848137 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="prometheus"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848142 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="prometheus"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848148 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy-thanos"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848153 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy-thanos"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848087 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84"} err="failed to get container status \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": rpc error: code = NotFound desc = could not find container \"e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84\": container with ID starting with e39ec7b8d902e424d5381fda310e7d56114aa013182f24d7dc655d1272d07b84 not found: ID does not exist"
Apr 24 19:11:21.848198 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848170 2568 scope.go:117] "RemoveContainer" containerID="9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848161 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="thanos-sidecar"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848220 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="thanos-sidecar"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848232 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy-web"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848238 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy-web"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848305 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy-web"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848315 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="prometheus"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848321 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="config-reloader"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848329 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="thanos-sidecar"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848335 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy-thanos"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848342 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b4671a4-a2bd-4f33-b18e-eead8bfe79fd" containerName="registry"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848348 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" containerName="kube-rbac-proxy"
Apr 24 19:11:21.848592 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.848405 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8"} err="failed to get container status \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": rpc error: code = NotFound desc = could not find container \"9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8\": container with ID starting with 9712affae6ed4379bcaff6e1012cd084cb5acf7ceb8afa3e4001a6cab327f9b8 not found: ID does not exist"
Apr 24 19:11:21.853726 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.853707 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:11:21.856464 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.856447 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 24 19:11:21.856553 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.856510 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 19:11:21.856553 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.856521 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-79lp5\""
Apr 24 19:11:21.857317 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.857202 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 19:11:21.857417 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.857393 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 19:11:21.857417 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.857396 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6gd5cqnsfnrd4\""
Apr 24 19:11:21.857529 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.857479 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 19:11:21.857529 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.857516 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 19:11:21.857529 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.857524 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 19:11:21.857692 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.857678 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 19:11:21.857832 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.857818 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 19:11:21.858059 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.858043 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 19:11:21.858152 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.858136 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 19:11:21.861829 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.861810 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 19:11:21.865048 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.865032 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 19:11:21.874888 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.870071 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:11:21.982397 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:21.982377 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f596b606-4787-4bf8-ab56-54191d7a1ecb" path="/var/lib/kubelet/pods/f596b606-4787-4bf8-ab56-54191d7a1ecb/volumes" Apr 24 19:11:22.032204 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032184 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" 
(UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032270 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032270 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032235 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/01e4c761-3041-405b-9afb-cbcf743364a1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032333 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032295 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032408 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032331 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/01e4c761-3041-405b-9afb-cbcf743364a1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032408 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:11:22.032350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032408 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032382 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48zcn\" (UniqueName: \"kubernetes.io/projected/01e4c761-3041-405b-9afb-cbcf743364a1-kube-api-access-48zcn\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032495 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032412 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-web-config\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032495 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032427 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032495 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032443 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032495 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032456 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032495 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032487 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-config\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032516 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032565 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/01e4c761-3041-405b-9afb-cbcf743364a1-config-out\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032590 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032607 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032645 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032631 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.032792 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.032654 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.133105 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.133085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.133396 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.133113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.133396 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.133128 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.133396 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.133143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.133396 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.133167 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.133396 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.133189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.133719 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.133648 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/01e4c761-3041-405b-9afb-cbcf743364a1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.133885 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.133826 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134059 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134004 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134166 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134136 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/01e4c761-3041-405b-9afb-cbcf743364a1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134250 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134185 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134250 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134223 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48zcn\" (UniqueName: \"kubernetes.io/projected/01e4c761-3041-405b-9afb-cbcf743364a1-kube-api-access-48zcn\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134805 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134312 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-web-config\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134805 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134805 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134805 
ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134395 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134805 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134432 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-config\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134805 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.134805 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.134482 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/01e4c761-3041-405b-9afb-cbcf743364a1-config-out\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.135185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/01e4c761-3041-405b-9afb-cbcf743364a1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:11:22.135251 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.135490 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.135695 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.137108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.137493 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-web-config\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.137643 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.137844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.138756 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/01e4c761-3041-405b-9afb-cbcf743364a1-config-out\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139018 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.139000 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/01e4c761-3041-405b-9afb-cbcf743364a1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139634 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.139306 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139634 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.139342 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139634 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.139573 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139634 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.139584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-config\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.139944 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.139922 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/01e4c761-3041-405b-9afb-cbcf743364a1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.140344 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.140321 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/01e4c761-3041-405b-9afb-cbcf743364a1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.142215 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.142196 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48zcn\" (UniqueName: \"kubernetes.io/projected/01e4c761-3041-405b-9afb-cbcf743364a1-kube-api-access-48zcn\") pod \"prometheus-k8s-0\" (UID: \"01e4c761-3041-405b-9afb-cbcf743364a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.163244 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.163222 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:22.290418 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.290385 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:11:22.293046 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:11:22.293018 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e4c761_3041_405b_9afb_cbcf743364a1.slice/crio-3b6375b7fc01754b2679464e94561577c3c32c4b6987689d7c1fe46f286c6f63 WatchSource:0}: Error finding container 3b6375b7fc01754b2679464e94561577c3c32c4b6987689d7c1fe46f286c6f63: Status 404 returned error can't find the container with id 3b6375b7fc01754b2679464e94561577c3c32c4b6987689d7c1fe46f286c6f63 Apr 24 19:11:22.797167 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.797136 2568 generic.go:358] "Generic (PLEG): container finished" podID="01e4c761-3041-405b-9afb-cbcf743364a1" containerID="14fddba740cf64b858f73bd766e82a082390a3492b1609db0005592828b22435" exitCode=0 Apr 24 19:11:22.797347 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.797206 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"01e4c761-3041-405b-9afb-cbcf743364a1","Type":"ContainerDied","Data":"14fddba740cf64b858f73bd766e82a082390a3492b1609db0005592828b22435"} Apr 24 19:11:22.797347 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:22.797228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"01e4c761-3041-405b-9afb-cbcf743364a1","Type":"ContainerStarted","Data":"3b6375b7fc01754b2679464e94561577c3c32c4b6987689d7c1fe46f286c6f63"} Apr 24 19:11:23.803770 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:23.803731 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"01e4c761-3041-405b-9afb-cbcf743364a1","Type":"ContainerStarted","Data":"51670c96f15a7f429b32a451da7df9c8f80ba0cac4c9e4fb84f12a7e91fcfafa"} Apr 24 19:11:23.803770 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:23.803766 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"01e4c761-3041-405b-9afb-cbcf743364a1","Type":"ContainerStarted","Data":"5d5331e05c648ef9ee19717748fb63d6c1ee5debba2cffcb961ea30d8a7ed206"} Apr 24 19:11:23.803770 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:23.803778 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"01e4c761-3041-405b-9afb-cbcf743364a1","Type":"ContainerStarted","Data":"ebf664e75e2a60d4634f3a45eaffe3af7332597f2602ab2de8097b78697b0487"} Apr 24 19:11:23.804196 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:23.803787 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"01e4c761-3041-405b-9afb-cbcf743364a1","Type":"ContainerStarted","Data":"ad37f6d82e6c1567b03206396fdd119423a545a7aa48071693a5a40f880f5cea"} Apr 24 19:11:23.804196 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:23.803795 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"01e4c761-3041-405b-9afb-cbcf743364a1","Type":"ContainerStarted","Data":"6aa33a5150bb5e741cd0d2eddaca160052d58a4f025a37c59411e94a3f105c0b"} Apr 24 19:11:23.804196 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:23.803804 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"01e4c761-3041-405b-9afb-cbcf743364a1","Type":"ContainerStarted","Data":"d11825193c41d9d1f1eb17e061254ea4447cccfb4caaab3ce43382a42400e4cc"} Apr 24 19:11:23.831931 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:23.831881 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.831863151 podStartE2EDuration="2.831863151s" podCreationTimestamp="2026-04-24 19:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:11:23.829853751 +0000 UTC m=+248.468973591" watchObservedRunningTime="2026-04-24 19:11:23.831863151 +0000 UTC m=+248.470982993" Apr 24 19:11:26.773068 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:26.772988 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:11:26.775227 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:26.775209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/060d8b4b-7fbe-4109-888d-a5c4822cff6e-metrics-certs\") pod \"network-metrics-daemon-4lz47\" (UID: \"060d8b4b-7fbe-4109-888d-a5c4822cff6e\") " pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:11:26.881593 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:26.881563 2568 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h6pgv\"" Apr 24 19:11:26.889575 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:26.889552 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lz47" Apr 24 19:11:27.000716 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:27.000597 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4lz47"] Apr 24 19:11:27.003279 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:11:27.003252 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod060d8b4b_7fbe_4109_888d_a5c4822cff6e.slice/crio-640fb310f10c21231290c927af3b2cbddbbfe09c335ace250344d3cccef6051f WatchSource:0}: Error finding container 640fb310f10c21231290c927af3b2cbddbbfe09c335ace250344d3cccef6051f: Status 404 returned error can't find the container with id 640fb310f10c21231290c927af3b2cbddbbfe09c335ace250344d3cccef6051f Apr 24 19:11:27.163453 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:27.163423 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:27.816511 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:27.816473 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4lz47" event={"ID":"060d8b4b-7fbe-4109-888d-a5c4822cff6e","Type":"ContainerStarted","Data":"640fb310f10c21231290c927af3b2cbddbbfe09c335ace250344d3cccef6051f"} Apr 24 19:11:28.820699 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:28.820656 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4lz47" event={"ID":"060d8b4b-7fbe-4109-888d-a5c4822cff6e","Type":"ContainerStarted","Data":"a5d77b9240e37ead70fa48ec616ffb3d463b9b7397c69e32d7b365e4e9a9279e"} Apr 24 19:11:28.821125 ip-10-0-129-23 
kubenswrapper[2568]: I0424 19:11:28.820708 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4lz47" event={"ID":"060d8b4b-7fbe-4109-888d-a5c4822cff6e","Type":"ContainerStarted","Data":"42e35160fc1809118cd750e5498d5cc2e0b9ac140fdcec1ae8cbda9a54bf59f3"} Apr 24 19:11:28.836635 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:11:28.836593 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4lz47" podStartSLOduration=251.918872272 podStartE2EDuration="4m12.836579024s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:11:27.005141221 +0000 UTC m=+251.644261040" lastFinishedPulling="2026-04-24 19:11:27.922847972 +0000 UTC m=+252.561967792" observedRunningTime="2026-04-24 19:11:28.835004746 +0000 UTC m=+253.474124588" watchObservedRunningTime="2026-04-24 19:11:28.836579024 +0000 UTC m=+253.475698864" Apr 24 19:12:15.871582 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:12:15.871559 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:12:15.872354 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:12:15.872333 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:12:15.878408 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:12:15.878390 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 19:12:22.164043 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:12:22.164007 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:12:22.178852 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:12:22.178830 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
19:12:22.988556 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:12:22.988529 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:13:09.567610 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.567529 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7jtcs"] Apr 24 19:13:09.570432 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.570394 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.572757 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.572736 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 19:13:09.578102 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.578079 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7jtcs"] Apr 24 19:13:09.686533 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.686507 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c1caa964-4c7b-426a-8598-6c1acbee1107-dbus\") pod \"global-pull-secret-syncer-7jtcs\" (UID: \"c1caa964-4c7b-426a-8598-6c1acbee1107\") " pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.686654 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.686580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c1caa964-4c7b-426a-8598-6c1acbee1107-kubelet-config\") pod \"global-pull-secret-syncer-7jtcs\" (UID: \"c1caa964-4c7b-426a-8598-6c1acbee1107\") " pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.686654 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.686613 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c1caa964-4c7b-426a-8598-6c1acbee1107-original-pull-secret\") pod \"global-pull-secret-syncer-7jtcs\" (UID: \"c1caa964-4c7b-426a-8598-6c1acbee1107\") " pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.787849 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.787820 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c1caa964-4c7b-426a-8598-6c1acbee1107-dbus\") pod \"global-pull-secret-syncer-7jtcs\" (UID: \"c1caa964-4c7b-426a-8598-6c1acbee1107\") " pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.787983 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.787877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c1caa964-4c7b-426a-8598-6c1acbee1107-kubelet-config\") pod \"global-pull-secret-syncer-7jtcs\" (UID: \"c1caa964-4c7b-426a-8598-6c1acbee1107\") " pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.787983 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.787910 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c1caa964-4c7b-426a-8598-6c1acbee1107-original-pull-secret\") pod \"global-pull-secret-syncer-7jtcs\" (UID: \"c1caa964-4c7b-426a-8598-6c1acbee1107\") " pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.787983 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.787949 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c1caa964-4c7b-426a-8598-6c1acbee1107-kubelet-config\") pod \"global-pull-secret-syncer-7jtcs\" (UID: \"c1caa964-4c7b-426a-8598-6c1acbee1107\") " pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.788105 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.788037 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c1caa964-4c7b-426a-8598-6c1acbee1107-dbus\") pod \"global-pull-secret-syncer-7jtcs\" (UID: \"c1caa964-4c7b-426a-8598-6c1acbee1107\") " pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.790140 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.790115 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c1caa964-4c7b-426a-8598-6c1acbee1107-original-pull-secret\") pod \"global-pull-secret-syncer-7jtcs\" (UID: \"c1caa964-4c7b-426a-8598-6c1acbee1107\") " pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.880404 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.880374 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7jtcs" Apr 24 19:13:09.997484 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:09.997458 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7jtcs"] Apr 24 19:13:10.000551 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:13:10.000524 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1caa964_4c7b_426a_8598_6c1acbee1107.slice/crio-0af64c71935388d17cd1334288c77035f7d11bddd1af837e49ed24941146a6f8 WatchSource:0}: Error finding container 0af64c71935388d17cd1334288c77035f7d11bddd1af837e49ed24941146a6f8: Status 404 returned error can't find the container with id 0af64c71935388d17cd1334288c77035f7d11bddd1af837e49ed24941146a6f8 Apr 24 19:13:10.002067 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:10.002052 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:13:10.105499 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:10.105471 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-7jtcs" event={"ID":"c1caa964-4c7b-426a-8598-6c1acbee1107","Type":"ContainerStarted","Data":"0af64c71935388d17cd1334288c77035f7d11bddd1af837e49ed24941146a6f8"} Apr 24 19:13:14.119666 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:14.119631 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7jtcs" event={"ID":"c1caa964-4c7b-426a-8598-6c1acbee1107","Type":"ContainerStarted","Data":"edd641ee6d8517f50c717fde84e8ec09cea9fcc79eae33ac9494f3012fa66308"} Apr 24 19:13:14.142919 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:13:14.142872 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7jtcs" podStartSLOduration=1.224487922 podStartE2EDuration="5.142857451s" podCreationTimestamp="2026-04-24 19:13:09 +0000 UTC" firstStartedPulling="2026-04-24 19:13:10.002174714 +0000 UTC m=+354.641294533" lastFinishedPulling="2026-04-24 19:13:13.920544238 +0000 UTC m=+358.559664062" observedRunningTime="2026-04-24 19:13:14.142189295 +0000 UTC m=+358.781309136" watchObservedRunningTime="2026-04-24 19:13:14.142857451 +0000 UTC m=+358.781977315" Apr 24 19:16:30.868909 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:30.868826 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-sz8wv"] Apr 24 19:16:30.871011 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:30.870991 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sz8wv" Apr 24 19:16:30.873747 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:30.873725 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 19:16:30.874661 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:30.874643 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 19:16:30.874742 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:30.874645 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dnd96\"" Apr 24 19:16:30.874742 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:30.874649 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 19:16:30.880256 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:30.880235 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sz8wv"] Apr 24 19:16:30.986918 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:30.986894 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fchw\" (UniqueName: \"kubernetes.io/projected/4b84cbeb-f63f-4f1e-acd6-b395d704b97c-kube-api-access-7fchw\") pod \"s3-init-sz8wv\" (UID: \"4b84cbeb-f63f-4f1e-acd6-b395d704b97c\") " pod="kserve/s3-init-sz8wv" Apr 24 19:16:31.087723 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:31.087696 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fchw\" (UniqueName: \"kubernetes.io/projected/4b84cbeb-f63f-4f1e-acd6-b395d704b97c-kube-api-access-7fchw\") pod \"s3-init-sz8wv\" (UID: \"4b84cbeb-f63f-4f1e-acd6-b395d704b97c\") " pod="kserve/s3-init-sz8wv" Apr 24 19:16:31.096290 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:31.096263 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fchw\" (UniqueName: 
\"kubernetes.io/projected/4b84cbeb-f63f-4f1e-acd6-b395d704b97c-kube-api-access-7fchw\") pod \"s3-init-sz8wv\" (UID: \"4b84cbeb-f63f-4f1e-acd6-b395d704b97c\") " pod="kserve/s3-init-sz8wv" Apr 24 19:16:31.185985 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:31.185918 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sz8wv" Apr 24 19:16:31.300758 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:31.300707 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sz8wv"] Apr 24 19:16:31.303326 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:16:31.303298 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b84cbeb_f63f_4f1e_acd6_b395d704b97c.slice/crio-de4757ea2b7f2d344239522bcfa9a95f3c79dbd0f1fb9db3dd28c07f585d58ef WatchSource:0}: Error finding container de4757ea2b7f2d344239522bcfa9a95f3c79dbd0f1fb9db3dd28c07f585d58ef: Status 404 returned error can't find the container with id de4757ea2b7f2d344239522bcfa9a95f3c79dbd0f1fb9db3dd28c07f585d58ef Apr 24 19:16:31.656170 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:31.656139 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sz8wv" event={"ID":"4b84cbeb-f63f-4f1e-acd6-b395d704b97c","Type":"ContainerStarted","Data":"de4757ea2b7f2d344239522bcfa9a95f3c79dbd0f1fb9db3dd28c07f585d58ef"} Apr 24 19:16:35.672350 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:35.672321 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sz8wv" event={"ID":"4b84cbeb-f63f-4f1e-acd6-b395d704b97c","Type":"ContainerStarted","Data":"126ea02038bc2d1198902f23352f7cad06961eed9f6be85ee5a058078f0f1c92"} Apr 24 19:16:35.687655 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:35.687611 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-sz8wv" podStartSLOduration=1.4034461089999999 podStartE2EDuration="5.687595567s" 
podCreationTimestamp="2026-04-24 19:16:30 +0000 UTC" firstStartedPulling="2026-04-24 19:16:31.304968994 +0000 UTC m=+555.944088813" lastFinishedPulling="2026-04-24 19:16:35.589118433 +0000 UTC m=+560.228238271" observedRunningTime="2026-04-24 19:16:35.686288425 +0000 UTC m=+560.325408265" watchObservedRunningTime="2026-04-24 19:16:35.687595567 +0000 UTC m=+560.326715408" Apr 24 19:16:38.684527 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:38.684496 2568 generic.go:358] "Generic (PLEG): container finished" podID="4b84cbeb-f63f-4f1e-acd6-b395d704b97c" containerID="126ea02038bc2d1198902f23352f7cad06961eed9f6be85ee5a058078f0f1c92" exitCode=0 Apr 24 19:16:38.684860 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:38.684569 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sz8wv" event={"ID":"4b84cbeb-f63f-4f1e-acd6-b395d704b97c","Type":"ContainerDied","Data":"126ea02038bc2d1198902f23352f7cad06961eed9f6be85ee5a058078f0f1c92"} Apr 24 19:16:39.809463 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:39.809434 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sz8wv" Apr 24 19:16:39.866053 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:39.866027 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fchw\" (UniqueName: \"kubernetes.io/projected/4b84cbeb-f63f-4f1e-acd6-b395d704b97c-kube-api-access-7fchw\") pod \"4b84cbeb-f63f-4f1e-acd6-b395d704b97c\" (UID: \"4b84cbeb-f63f-4f1e-acd6-b395d704b97c\") " Apr 24 19:16:39.868057 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:39.868034 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b84cbeb-f63f-4f1e-acd6-b395d704b97c-kube-api-access-7fchw" (OuterVolumeSpecName: "kube-api-access-7fchw") pod "4b84cbeb-f63f-4f1e-acd6-b395d704b97c" (UID: "4b84cbeb-f63f-4f1e-acd6-b395d704b97c"). InnerVolumeSpecName "kube-api-access-7fchw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:16:39.967552 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:39.967494 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7fchw\" (UniqueName: \"kubernetes.io/projected/4b84cbeb-f63f-4f1e-acd6-b395d704b97c-kube-api-access-7fchw\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:16:40.692331 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:40.692300 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sz8wv" event={"ID":"4b84cbeb-f63f-4f1e-acd6-b395d704b97c","Type":"ContainerDied","Data":"de4757ea2b7f2d344239522bcfa9a95f3c79dbd0f1fb9db3dd28c07f585d58ef"} Apr 24 19:16:40.692482 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:40.692336 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de4757ea2b7f2d344239522bcfa9a95f3c79dbd0f1fb9db3dd28c07f585d58ef" Apr 24 19:16:40.692482 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:40.692341 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sz8wv" Apr 24 19:16:48.087654 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.087620 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-pn4jd"] Apr 24 19:16:48.088025 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.087912 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b84cbeb-f63f-4f1e-acd6-b395d704b97c" containerName="s3-init" Apr 24 19:16:48.088025 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.087924 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b84cbeb-f63f-4f1e-acd6-b395d704b97c" containerName="s3-init" Apr 24 19:16:48.088025 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.087987 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b84cbeb-f63f-4f1e-acd6-b395d704b97c" containerName="s3-init" Apr 24 19:16:48.090004 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.089988 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-pn4jd" Apr 24 19:16:48.092290 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.092266 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 19:16:48.092537 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.092521 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 19:16:48.093205 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.093187 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dnd96\"" Apr 24 19:16:48.093254 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.093206 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 19:16:48.097032 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.096776 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/s3-tls-init-custom-pn4jd"] Apr 24 19:16:48.225291 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.225266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znhnl\" (UniqueName: \"kubernetes.io/projected/f11b0545-1430-4e9b-b7d6-54352ffaa24c-kube-api-access-znhnl\") pod \"s3-tls-init-custom-pn4jd\" (UID: \"f11b0545-1430-4e9b-b7d6-54352ffaa24c\") " pod="kserve/s3-tls-init-custom-pn4jd" Apr 24 19:16:48.326338 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.326313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znhnl\" (UniqueName: \"kubernetes.io/projected/f11b0545-1430-4e9b-b7d6-54352ffaa24c-kube-api-access-znhnl\") pod \"s3-tls-init-custom-pn4jd\" (UID: \"f11b0545-1430-4e9b-b7d6-54352ffaa24c\") " pod="kserve/s3-tls-init-custom-pn4jd" Apr 24 19:16:48.334603 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.334579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znhnl\" (UniqueName: \"kubernetes.io/projected/f11b0545-1430-4e9b-b7d6-54352ffaa24c-kube-api-access-znhnl\") pod \"s3-tls-init-custom-pn4jd\" (UID: \"f11b0545-1430-4e9b-b7d6-54352ffaa24c\") " pod="kserve/s3-tls-init-custom-pn4jd" Apr 24 19:16:48.412236 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.412182 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-pn4jd" Apr 24 19:16:48.525408 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.525379 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-pn4jd"] Apr 24 19:16:48.529230 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:16:48.529200 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf11b0545_1430_4e9b_b7d6_54352ffaa24c.slice/crio-cc047ccc070d04fe2f8cd4373cd0f9d5af4fdceb775ce33e310e20edd053314b WatchSource:0}: Error finding container cc047ccc070d04fe2f8cd4373cd0f9d5af4fdceb775ce33e310e20edd053314b: Status 404 returned error can't find the container with id cc047ccc070d04fe2f8cd4373cd0f9d5af4fdceb775ce33e310e20edd053314b Apr 24 19:16:48.716875 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.716799 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-pn4jd" event={"ID":"f11b0545-1430-4e9b-b7d6-54352ffaa24c","Type":"ContainerStarted","Data":"2c96fcdfedddad46b1c16c5f70ed991ca19c080a340127b34e7709cae375155d"} Apr 24 19:16:48.716875 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.716833 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-pn4jd" event={"ID":"f11b0545-1430-4e9b-b7d6-54352ffaa24c","Type":"ContainerStarted","Data":"cc047ccc070d04fe2f8cd4373cd0f9d5af4fdceb775ce33e310e20edd053314b"} Apr 24 19:16:48.731577 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:48.731529 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-pn4jd" podStartSLOduration=0.731513492 podStartE2EDuration="731.513492ms" podCreationTimestamp="2026-04-24 19:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:16:48.731068674 +0000 UTC m=+573.370188519" watchObservedRunningTime="2026-04-24 19:16:48.731513492 
+0000 UTC m=+573.370633330" Apr 24 19:16:53.733000 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:53.732970 2568 generic.go:358] "Generic (PLEG): container finished" podID="f11b0545-1430-4e9b-b7d6-54352ffaa24c" containerID="2c96fcdfedddad46b1c16c5f70ed991ca19c080a340127b34e7709cae375155d" exitCode=0 Apr 24 19:16:53.733377 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:53.733047 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-pn4jd" event={"ID":"f11b0545-1430-4e9b-b7d6-54352ffaa24c","Type":"ContainerDied","Data":"2c96fcdfedddad46b1c16c5f70ed991ca19c080a340127b34e7709cae375155d"} Apr 24 19:16:54.860548 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:54.860528 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-pn4jd" Apr 24 19:16:54.977942 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:54.977915 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znhnl\" (UniqueName: \"kubernetes.io/projected/f11b0545-1430-4e9b-b7d6-54352ffaa24c-kube-api-access-znhnl\") pod \"f11b0545-1430-4e9b-b7d6-54352ffaa24c\" (UID: \"f11b0545-1430-4e9b-b7d6-54352ffaa24c\") " Apr 24 19:16:54.979890 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:54.979860 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11b0545-1430-4e9b-b7d6-54352ffaa24c-kube-api-access-znhnl" (OuterVolumeSpecName: "kube-api-access-znhnl") pod "f11b0545-1430-4e9b-b7d6-54352ffaa24c" (UID: "f11b0545-1430-4e9b-b7d6-54352ffaa24c"). InnerVolumeSpecName "kube-api-access-znhnl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:16:55.078913 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:55.078851 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znhnl\" (UniqueName: \"kubernetes.io/projected/f11b0545-1430-4e9b-b7d6-54352ffaa24c-kube-api-access-znhnl\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:16:55.740345 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:55.740312 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-pn4jd" event={"ID":"f11b0545-1430-4e9b-b7d6-54352ffaa24c","Type":"ContainerDied","Data":"cc047ccc070d04fe2f8cd4373cd0f9d5af4fdceb775ce33e310e20edd053314b"} Apr 24 19:16:55.740490 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:55.740350 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc047ccc070d04fe2f8cd4373cd0f9d5af4fdceb775ce33e310e20edd053314b" Apr 24 19:16:55.740490 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:55.740329 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-pn4jd" Apr 24 19:16:58.330927 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.330897 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-w4k79"] Apr 24 19:16:58.331306 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.331223 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f11b0545-1430-4e9b-b7d6-54352ffaa24c" containerName="s3-tls-init-custom" Apr 24 19:16:58.331306 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.331235 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11b0545-1430-4e9b-b7d6-54352ffaa24c" containerName="s3-tls-init-custom" Apr 24 19:16:58.331306 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.331285 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f11b0545-1430-4e9b-b7d6-54352ffaa24c" containerName="s3-tls-init-custom" Apr 24 19:16:58.334199 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.334183 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-w4k79" Apr 24 19:16:58.336707 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.336683 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dnd96\"" Apr 24 19:16:58.336707 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.336697 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 19:16:58.337651 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.337632 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 19:16:58.337651 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.337633 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 19:16:58.340399 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.340367 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-w4k79"] Apr 24 19:16:58.402854 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.402832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k2fj\" (UniqueName: \"kubernetes.io/projected/76eff44f-2c43-44dc-9ee7-d0823ac63817-kube-api-access-9k2fj\") pod \"s3-tls-init-serving-w4k79\" (UID: \"76eff44f-2c43-44dc-9ee7-d0823ac63817\") " pod="kserve/s3-tls-init-serving-w4k79" Apr 24 19:16:58.504113 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.504088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k2fj\" (UniqueName: \"kubernetes.io/projected/76eff44f-2c43-44dc-9ee7-d0823ac63817-kube-api-access-9k2fj\") pod \"s3-tls-init-serving-w4k79\" (UID: \"76eff44f-2c43-44dc-9ee7-d0823ac63817\") " pod="kserve/s3-tls-init-serving-w4k79" Apr 24 19:16:58.511844 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.511822 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9k2fj\" (UniqueName: \"kubernetes.io/projected/76eff44f-2c43-44dc-9ee7-d0823ac63817-kube-api-access-9k2fj\") pod \"s3-tls-init-serving-w4k79\" (UID: \"76eff44f-2c43-44dc-9ee7-d0823ac63817\") " pod="kserve/s3-tls-init-serving-w4k79" Apr 24 19:16:58.655779 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.655725 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-w4k79" Apr 24 19:16:58.776314 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:58.776291 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-w4k79"] Apr 24 19:16:58.778688 ip-10-0-129-23 kubenswrapper[2568]: W0424 19:16:58.778660 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76eff44f_2c43_44dc_9ee7_d0823ac63817.slice/crio-076ee9cc5b9349efb155337c5568ea7c8a02465a8cbfa1ea2ac49344a8fb81d1 WatchSource:0}: Error finding container 076ee9cc5b9349efb155337c5568ea7c8a02465a8cbfa1ea2ac49344a8fb81d1: Status 404 returned error can't find the container with id 076ee9cc5b9349efb155337c5568ea7c8a02465a8cbfa1ea2ac49344a8fb81d1 Apr 24 19:16:59.753735 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:59.753694 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-w4k79" event={"ID":"76eff44f-2c43-44dc-9ee7-d0823ac63817","Type":"ContainerStarted","Data":"3c60a0b1e1b3f70594501275f96eb51b13dffa19602b2c7c55126978920e625f"} Apr 24 19:16:59.753735 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:59.753736 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-w4k79" event={"ID":"76eff44f-2c43-44dc-9ee7-d0823ac63817","Type":"ContainerStarted","Data":"076ee9cc5b9349efb155337c5568ea7c8a02465a8cbfa1ea2ac49344a8fb81d1"} Apr 24 19:16:59.769643 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:16:59.769594 2568 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/s3-tls-init-serving-w4k79" podStartSLOduration=1.769577275 podStartE2EDuration="1.769577275s" podCreationTimestamp="2026-04-24 19:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:16:59.769099839 +0000 UTC m=+584.408219680" watchObservedRunningTime="2026-04-24 19:16:59.769577275 +0000 UTC m=+584.408697116" Apr 24 19:17:04.769422 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:04.769390 2568 generic.go:358] "Generic (PLEG): container finished" podID="76eff44f-2c43-44dc-9ee7-d0823ac63817" containerID="3c60a0b1e1b3f70594501275f96eb51b13dffa19602b2c7c55126978920e625f" exitCode=0 Apr 24 19:17:04.769907 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:04.769431 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-w4k79" event={"ID":"76eff44f-2c43-44dc-9ee7-d0823ac63817","Type":"ContainerDied","Data":"3c60a0b1e1b3f70594501275f96eb51b13dffa19602b2c7c55126978920e625f"} Apr 24 19:17:05.896798 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:05.896775 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-w4k79" Apr 24 19:17:06.065028 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:06.064946 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k2fj\" (UniqueName: \"kubernetes.io/projected/76eff44f-2c43-44dc-9ee7-d0823ac63817-kube-api-access-9k2fj\") pod \"76eff44f-2c43-44dc-9ee7-d0823ac63817\" (UID: \"76eff44f-2c43-44dc-9ee7-d0823ac63817\") " Apr 24 19:17:06.067080 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:06.067057 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76eff44f-2c43-44dc-9ee7-d0823ac63817-kube-api-access-9k2fj" (OuterVolumeSpecName: "kube-api-access-9k2fj") pod "76eff44f-2c43-44dc-9ee7-d0823ac63817" (UID: "76eff44f-2c43-44dc-9ee7-d0823ac63817"). InnerVolumeSpecName "kube-api-access-9k2fj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:17:06.166204 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:06.166170 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9k2fj\" (UniqueName: \"kubernetes.io/projected/76eff44f-2c43-44dc-9ee7-d0823ac63817-kube-api-access-9k2fj\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 19:17:06.775586 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:06.775551 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-w4k79" event={"ID":"76eff44f-2c43-44dc-9ee7-d0823ac63817","Type":"ContainerDied","Data":"076ee9cc5b9349efb155337c5568ea7c8a02465a8cbfa1ea2ac49344a8fb81d1"} Apr 24 19:17:06.775586 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:06.775584 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="076ee9cc5b9349efb155337c5568ea7c8a02465a8cbfa1ea2ac49344a8fb81d1" Apr 24 19:17:06.775586 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:06.775567 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-w4k79" Apr 24 19:17:15.893495 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:15.893466 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:17:15.894421 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:17:15.894396 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:22:15.914609 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:22:15.914581 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:22:15.916612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:22:15.916588 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:27:15.943379 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:27:15.943341 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:27:15.946005 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:27:15.945980 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:32:15.964319 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:32:15.964293 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:32:15.966217 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:32:15.966198 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:37:15.989822 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:37:15.989797 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:37:15.992299 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:37:15.992275 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:42:16.012725 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:42:16.012694 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:42:16.015770 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:42:16.015745 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:47:16.036067 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:47:16.036040 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:47:16.037926 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:47:16.037899 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:52:16.058518 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:52:16.058489 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:52:16.061204 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:52:16.061184 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:57:16.078612 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:57:16.078585 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 19:57:16.082709 ip-10-0-129-23 kubenswrapper[2568]: I0424 19:57:16.082690 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 20:02:16.098795 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:02:16.098768 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 20:02:16.102633 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:02:16.102610 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 20:07:16.119367 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:07:16.119336 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 20:07:16.123333 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:07:16.123309 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 20:12:16.139734 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:12:16.139708 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 20:12:16.144192 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:12:16.144170 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log" Apr 24 20:13:03.841171 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.841133 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-whsnq/must-gather-hl5ln"] Apr 24 20:13:03.841598 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.841494 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76eff44f-2c43-44dc-9ee7-d0823ac63817" containerName="s3-tls-init-serving" Apr 24 20:13:03.841598 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.841510 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="76eff44f-2c43-44dc-9ee7-d0823ac63817" containerName="s3-tls-init-serving" Apr 24 20:13:03.841598 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.841564 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="76eff44f-2c43-44dc-9ee7-d0823ac63817" containerName="s3-tls-init-serving" Apr 24 20:13:03.844665 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.844645 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 20:13:03.846755 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.846734 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-whsnq\"/\"kube-root-ca.crt\"" Apr 24 20:13:03.846946 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.846930 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-whsnq\"/\"openshift-service-ca.crt\"" Apr 24 20:13:03.853037 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.853015 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-whsnq/must-gather-hl5ln"] Apr 24 20:13:03.971249 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.971212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/176264cd-8699-44b3-a7bc-9d50be90adb1-must-gather-output\") pod \"must-gather-hl5ln\" (UID: \"176264cd-8699-44b3-a7bc-9d50be90adb1\") " pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 20:13:03.971430 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:03.971268 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc9r7\" (UniqueName: \"kubernetes.io/projected/176264cd-8699-44b3-a7bc-9d50be90adb1-kube-api-access-qc9r7\") pod \"must-gather-hl5ln\" (UID: \"176264cd-8699-44b3-a7bc-9d50be90adb1\") " pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 20:13:04.071879 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:04.071841 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/176264cd-8699-44b3-a7bc-9d50be90adb1-must-gather-output\") pod \"must-gather-hl5ln\" (UID: \"176264cd-8699-44b3-a7bc-9d50be90adb1\") " pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 
20:13:04.072077 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:04.071902 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc9r7\" (UniqueName: \"kubernetes.io/projected/176264cd-8699-44b3-a7bc-9d50be90adb1-kube-api-access-qc9r7\") pod \"must-gather-hl5ln\" (UID: \"176264cd-8699-44b3-a7bc-9d50be90adb1\") " pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 20:13:04.072225 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:04.072205 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/176264cd-8699-44b3-a7bc-9d50be90adb1-must-gather-output\") pod \"must-gather-hl5ln\" (UID: \"176264cd-8699-44b3-a7bc-9d50be90adb1\") " pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 20:13:04.080622 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:04.080594 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc9r7\" (UniqueName: \"kubernetes.io/projected/176264cd-8699-44b3-a7bc-9d50be90adb1-kube-api-access-qc9r7\") pod \"must-gather-hl5ln\" (UID: \"176264cd-8699-44b3-a7bc-9d50be90adb1\") " pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 20:13:04.163052 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:04.162967 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 20:13:04.279886 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:04.279853 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-whsnq/must-gather-hl5ln"] Apr 24 20:13:04.282826 ip-10-0-129-23 kubenswrapper[2568]: W0424 20:13:04.282801 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod176264cd_8699_44b3_a7bc_9d50be90adb1.slice/crio-cbc849447c41d417e014987abd0027180a66e07aaded3030da27b694361049ad WatchSource:0}: Error finding container cbc849447c41d417e014987abd0027180a66e07aaded3030da27b694361049ad: Status 404 returned error can't find the container with id cbc849447c41d417e014987abd0027180a66e07aaded3030da27b694361049ad Apr 24 20:13:04.284532 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:04.284509 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 20:13:04.399980 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:04.399927 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-whsnq/must-gather-hl5ln" event={"ID":"176264cd-8699-44b3-a7bc-9d50be90adb1","Type":"ContainerStarted","Data":"cbc849447c41d417e014987abd0027180a66e07aaded3030da27b694361049ad"} Apr 24 20:13:09.418102 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:09.418066 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-whsnq/must-gather-hl5ln" event={"ID":"176264cd-8699-44b3-a7bc-9d50be90adb1","Type":"ContainerStarted","Data":"021fcd9b5990fd451742fd9b8f3af3089f484665f700d7cbf76eec9cf6a5e488"} Apr 24 20:13:09.418102 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:09.418106 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-whsnq/must-gather-hl5ln" 
event={"ID":"176264cd-8699-44b3-a7bc-9d50be90adb1","Type":"ContainerStarted","Data":"7e968852613e6a28ac29d2d39f212ee9d544ea2a6a9f2a4e299ed5ec7148e62c"} Apr 24 20:13:09.433868 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:09.433794 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-whsnq/must-gather-hl5ln" podStartSLOduration=1.755611644 podStartE2EDuration="6.433775905s" podCreationTimestamp="2026-04-24 20:13:03 +0000 UTC" firstStartedPulling="2026-04-24 20:13:04.284654595 +0000 UTC m=+3948.923774415" lastFinishedPulling="2026-04-24 20:13:08.962818856 +0000 UTC m=+3953.601938676" observedRunningTime="2026-04-24 20:13:09.432475 +0000 UTC m=+3954.071594853" watchObservedRunningTime="2026-04-24 20:13:09.433775905 +0000 UTC m=+3954.072895746" Apr 24 20:13:30.486687 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:30.486653 2568 generic.go:358] "Generic (PLEG): container finished" podID="176264cd-8699-44b3-a7bc-9d50be90adb1" containerID="7e968852613e6a28ac29d2d39f212ee9d544ea2a6a9f2a4e299ed5ec7148e62c" exitCode=0 Apr 24 20:13:30.487131 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:30.486733 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-whsnq/must-gather-hl5ln" event={"ID":"176264cd-8699-44b3-a7bc-9d50be90adb1","Type":"ContainerDied","Data":"7e968852613e6a28ac29d2d39f212ee9d544ea2a6a9f2a4e299ed5ec7148e62c"} Apr 24 20:13:30.487131 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:30.487082 2568 scope.go:117] "RemoveContainer" containerID="7e968852613e6a28ac29d2d39f212ee9d544ea2a6a9f2a4e299ed5ec7148e62c" Apr 24 20:13:30.861049 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:30.861019 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-whsnq_must-gather-hl5ln_176264cd-8699-44b3-a7bc-9d50be90adb1/gather/0.log" Apr 24 20:13:35.170693 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:35.170654 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-7jtcs_c1caa964-4c7b-426a-8598-6c1acbee1107/global-pull-secret-syncer/0.log" Apr 24 20:13:35.429641 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:35.429556 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4f9zh_e3dcdece-1b7a-4952-8ab6-a6e6dc7089e9/konnectivity-agent/0.log" Apr 24 20:13:35.587125 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:35.587092 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-23.ec2.internal_5617684810147cab138b7763a579ba59/haproxy/0.log" Apr 24 20:13:36.325782 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.325743 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-whsnq/must-gather-hl5ln"] Apr 24 20:13:36.326526 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.326038 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-whsnq/must-gather-hl5ln" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" containerName="copy" containerID="cri-o://021fcd9b5990fd451742fd9b8f3af3089f484665f700d7cbf76eec9cf6a5e488" gracePeriod=2 Apr 24 20:13:36.327521 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.327489 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-whsnq/must-gather-hl5ln"] Apr 24 20:13:36.328013 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.327982 2568 status_manager.go:895] "Failed to get status for pod" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" pod="openshift-must-gather-whsnq/must-gather-hl5ln" err="pods \"must-gather-hl5ln\" is forbidden: User \"system:node:ip-10-0-129-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-whsnq\": no relationship found between node 'ip-10-0-129-23.ec2.internal' and this object" Apr 24 20:13:36.504797 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.504770 2568 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-whsnq_must-gather-hl5ln_176264cd-8699-44b3-a7bc-9d50be90adb1/copy/0.log" Apr 24 20:13:36.505160 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.505140 2568 generic.go:358] "Generic (PLEG): container finished" podID="176264cd-8699-44b3-a7bc-9d50be90adb1" containerID="021fcd9b5990fd451742fd9b8f3af3089f484665f700d7cbf76eec9cf6a5e488" exitCode=143 Apr 24 20:13:36.548909 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.548880 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-whsnq_must-gather-hl5ln_176264cd-8699-44b3-a7bc-9d50be90adb1/copy/0.log" Apr 24 20:13:36.549250 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.549235 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 20:13:36.551199 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.551170 2568 status_manager.go:895] "Failed to get status for pod" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" pod="openshift-must-gather-whsnq/must-gather-hl5ln" err="pods \"must-gather-hl5ln\" is forbidden: User \"system:node:ip-10-0-129-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-whsnq\": no relationship found between node 'ip-10-0-129-23.ec2.internal' and this object" Apr 24 20:13:36.648183 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.648160 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/176264cd-8699-44b3-a7bc-9d50be90adb1-must-gather-output\") pod \"176264cd-8699-44b3-a7bc-9d50be90adb1\" (UID: \"176264cd-8699-44b3-a7bc-9d50be90adb1\") " Apr 24 20:13:36.648348 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.648261 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc9r7\" (UniqueName: 
\"kubernetes.io/projected/176264cd-8699-44b3-a7bc-9d50be90adb1-kube-api-access-qc9r7\") pod \"176264cd-8699-44b3-a7bc-9d50be90adb1\" (UID: \"176264cd-8699-44b3-a7bc-9d50be90adb1\") " Apr 24 20:13:36.649535 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.649490 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/176264cd-8699-44b3-a7bc-9d50be90adb1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "176264cd-8699-44b3-a7bc-9d50be90adb1" (UID: "176264cd-8699-44b3-a7bc-9d50be90adb1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:13:36.650497 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.650471 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/176264cd-8699-44b3-a7bc-9d50be90adb1-kube-api-access-qc9r7" (OuterVolumeSpecName: "kube-api-access-qc9r7") pod "176264cd-8699-44b3-a7bc-9d50be90adb1" (UID: "176264cd-8699-44b3-a7bc-9d50be90adb1"). InnerVolumeSpecName "kube-api-access-qc9r7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:13:36.749342 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.749307 2568 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/176264cd-8699-44b3-a7bc-9d50be90adb1-must-gather-output\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 20:13:36.749342 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:36.749335 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qc9r7\" (UniqueName: \"kubernetes.io/projected/176264cd-8699-44b3-a7bc-9d50be90adb1-kube-api-access-qc9r7\") on node \"ip-10-0-129-23.ec2.internal\" DevicePath \"\"" Apr 24 20:13:37.509267 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:37.509240 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-whsnq_must-gather-hl5ln_176264cd-8699-44b3-a7bc-9d50be90adb1/copy/0.log" Apr 24 20:13:37.509716 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:37.509617 2568 scope.go:117] "RemoveContainer" containerID="021fcd9b5990fd451742fd9b8f3af3089f484665f700d7cbf76eec9cf6a5e488" Apr 24 20:13:37.509716 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:37.509621 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-whsnq/must-gather-hl5ln" Apr 24 20:13:37.511825 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:37.511798 2568 status_manager.go:895] "Failed to get status for pod" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" pod="openshift-must-gather-whsnq/must-gather-hl5ln" err="pods \"must-gather-hl5ln\" is forbidden: User \"system:node:ip-10-0-129-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-whsnq\": no relationship found between node 'ip-10-0-129-23.ec2.internal' and this object" Apr 24 20:13:37.517184 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:37.516817 2568 scope.go:117] "RemoveContainer" containerID="7e968852613e6a28ac29d2d39f212ee9d544ea2a6a9f2a4e299ed5ec7148e62c" Apr 24 20:13:37.519470 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:37.519448 2568 status_manager.go:895] "Failed to get status for pod" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" pod="openshift-must-gather-whsnq/must-gather-hl5ln" err="pods \"must-gather-hl5ln\" is forbidden: User \"system:node:ip-10-0-129-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-whsnq\": no relationship found between node 'ip-10-0-129-23.ec2.internal' and this object" Apr 24 20:13:37.983426 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:37.983393 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" path="/var/lib/kubelet/pods/176264cd-8699-44b3-a7bc-9d50be90adb1/volumes" Apr 24 20:13:39.173852 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.173768 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-vkb2m_cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa/kube-state-metrics/0.log" Apr 24 20:13:39.196254 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.196226 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-vkb2m_cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa/kube-rbac-proxy-main/0.log" Apr 24 20:13:39.217569 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.217541 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-vkb2m_cdef82cb-4f2d-4d39-99ad-0c822bb4a6fa/kube-rbac-proxy-self/0.log" Apr 24 20:13:39.380921 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.380883 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9tscd_53337fb5-281e-43b2-ac91-aa328517d13c/node-exporter/0.log" Apr 24 20:13:39.402566 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.402542 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9tscd_53337fb5-281e-43b2-ac91-aa328517d13c/kube-rbac-proxy/0.log" Apr 24 20:13:39.421786 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.421768 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9tscd_53337fb5-281e-43b2-ac91-aa328517d13c/init-textfile/0.log" Apr 24 20:13:39.630203 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.630174 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_01e4c761-3041-405b-9afb-cbcf743364a1/prometheus/0.log" Apr 24 20:13:39.649141 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.649110 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_01e4c761-3041-405b-9afb-cbcf743364a1/config-reloader/0.log" Apr 24 20:13:39.672475 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.672443 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_01e4c761-3041-405b-9afb-cbcf743364a1/thanos-sidecar/0.log" Apr 24 20:13:39.694390 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.694364 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_01e4c761-3041-405b-9afb-cbcf743364a1/kube-rbac-proxy-web/0.log" Apr 24 20:13:39.717579 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.717552 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_01e4c761-3041-405b-9afb-cbcf743364a1/kube-rbac-proxy/0.log" Apr 24 20:13:39.740203 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.740177 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_01e4c761-3041-405b-9afb-cbcf743364a1/kube-rbac-proxy-thanos/0.log" Apr 24 20:13:39.759850 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.759831 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_01e4c761-3041-405b-9afb-cbcf743364a1/init-config-reloader/0.log" Apr 24 20:13:39.796203 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.796169 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-5xljk_2ed520f7-8b69-424e-a4fc-e91657a114ee/prometheus-operator/0.log" Apr 24 20:13:39.818274 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:39.818241 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-5xljk_2ed520f7-8b69-424e-a4fc-e91657a114ee/kube-rbac-proxy/0.log" Apr 24 20:13:42.409078 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.409053 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-7cjvj_f3634274-2fdb-4eaf-aecc-9c56d2c42a6d/volume-data-source-validator/0.log" Apr 24 20:13:42.650969 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.650927 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"] Apr 24 20:13:42.651249 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.651236 2568 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" containerName="gather" Apr 24 20:13:42.651297 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.651251 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" containerName="gather" Apr 24 20:13:42.651297 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.651269 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" containerName="copy" Apr 24 20:13:42.651297 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.651274 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" containerName="copy" Apr 24 20:13:42.651391 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.651322 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" containerName="gather" Apr 24 20:13:42.651391 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.651329 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="176264cd-8699-44b3-a7bc-9d50be90adb1" containerName="copy" Apr 24 20:13:42.656490 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.656472 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" Apr 24 20:13:42.658534 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.658512 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tkz2z\"/\"default-dockercfg-k94xx\"" Apr 24 20:13:42.658666 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.658601 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tkz2z\"/\"openshift-service-ca.crt\"" Apr 24 20:13:42.659460 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.659412 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tkz2z\"/\"kube-root-ca.crt\"" Apr 24 20:13:42.663460 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.663440 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"] Apr 24 20:13:42.800310 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.800259 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-podres\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" Apr 24 20:13:42.800500 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.800319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-sys\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" Apr 24 20:13:42.800500 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.800344 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-proc\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" Apr 24 20:13:42.800500 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.800373 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-lib-modules\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" Apr 24 20:13:42.800500 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.800398 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmqlc\" (UniqueName: \"kubernetes.io/projected/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-kube-api-access-dmqlc\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" Apr 24 20:13:42.901331 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.901287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-sys\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" Apr 24 20:13:42.901331 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.901340 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-proc\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" Apr 
24 20:13:42.901558 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.901376 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-lib-modules\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:42.901558 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.901402 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmqlc\" (UniqueName: \"kubernetes.io/projected/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-kube-api-access-dmqlc\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:42.901558 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.901411 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-sys\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:42.901558 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.901475 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-proc\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:42.901713 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.901549 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-podres\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:42.901713 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.901613 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-podres\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:42.901713 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.901621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-lib-modules\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:42.909134 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.909105 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmqlc\" (UniqueName: \"kubernetes.io/projected/7e2199ec-fe03-4e50-8d23-d5bc6ac252a0-kube-api-access-dmqlc\") pod \"perf-node-gather-daemonset-hr2jw\" (UID: \"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:42.967182 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:42.967099 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:43.091708 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.091684 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"]
Apr 24 20:13:43.094138 ip-10-0-129-23 kubenswrapper[2568]: W0424 20:13:43.094111 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7e2199ec_fe03_4e50_8d23_d5bc6ac252a0.slice/crio-9698390dce3ce1459e0ecb3235acfec6f485b3609760321bffe0219e3391d825 WatchSource:0}: Error finding container 9698390dce3ce1459e0ecb3235acfec6f485b3609760321bffe0219e3391d825: Status 404 returned error can't find the container with id 9698390dce3ce1459e0ecb3235acfec6f485b3609760321bffe0219e3391d825
Apr 24 20:13:43.181702 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.181673 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jlkwm_7fae1c6b-197d-49e4-a9eb-9b922eaa6f48/dns/0.log"
Apr 24 20:13:43.202755 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.202724 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jlkwm_7fae1c6b-197d-49e4-a9eb-9b922eaa6f48/kube-rbac-proxy/0.log"
Apr 24 20:13:43.276661 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.276583 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m6mmt_1e21e827-de03-48d5-b7ca-3a5a1c529873/dns-node-resolver/0.log"
Apr 24 20:13:43.534024 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.533927 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" event={"ID":"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0","Type":"ContainerStarted","Data":"6d5d058f6758aaf8e08c27a2de7f24e788c4f78033c7ed00f0d54860ded5438a"}
Apr 24 20:13:43.534024 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.533987 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" event={"ID":"7e2199ec-fe03-4e50-8d23-d5bc6ac252a0","Type":"ContainerStarted","Data":"9698390dce3ce1459e0ecb3235acfec6f485b3609760321bffe0219e3391d825"}
Apr 24 20:13:43.534479 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.534131 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:43.549634 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.549526 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw" podStartSLOduration=1.54950868 podStartE2EDuration="1.54950868s" podCreationTimestamp="2026-04-24 20:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:13:43.549401415 +0000 UTC m=+3988.188521257" watchObservedRunningTime="2026-04-24 20:13:43.54950868 +0000 UTC m=+3988.188628519"
Apr 24 20:13:43.694809 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.694771 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6686b677f9-mncpn_b7d54045-6d96-4b03-8fa0-b6bab817dab6/registry/0.log"
Apr 24 20:13:43.775174 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:43.775145 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vdgfs_4be73708-29e9-4ed5-856c-a07616631d8e/node-ca/0.log"
Apr 24 20:13:44.874496 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:44.874465 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zxshv_e703cafc-bfc2-4649-968d-ef6e4318694a/serve-healthcheck-canary/0.log"
Apr 24 20:13:45.228368 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:45.228252 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-d9pbk_fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa/insights-operator/0.log"
Apr 24 20:13:45.229836 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:45.229795 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-d9pbk_fc7b8d95-3941-4c11-9c0a-14bfffe3e1fa/insights-operator/1.log"
Apr 24 20:13:45.252542 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:45.252518 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9bv9b_c2e7f038-e2aa-4900-9d66-5fb67c767701/kube-rbac-proxy/0.log"
Apr 24 20:13:45.273180 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:45.273159 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9bv9b_c2e7f038-e2aa-4900-9d66-5fb67c767701/exporter/0.log"
Apr 24 20:13:45.294497 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:45.294472 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9bv9b_c2e7f038-e2aa-4900-9d66-5fb67c767701/extractor/0.log"
Apr 24 20:13:48.054673 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:48.054643 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-sz8wv_4b84cbeb-f63f-4f1e-acd6-b395d704b97c/s3-init/0.log"
Apr 24 20:13:48.082450 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:48.082414 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-pn4jd_f11b0545-1430-4e9b-b7d6-54352ffaa24c/s3-tls-init-custom/0.log"
Apr 24 20:13:48.109428 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:48.109391 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-w4k79_76eff44f-2c43-44dc-9ee7-d0823ac63817/s3-tls-init-serving/0.log"
Apr 24 20:13:49.545426 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:49.545398 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-hr2jw"
Apr 24 20:13:52.493389 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:52.493351 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dxjdh_c4d6f584-a9a2-4297-9d42-6683202fc40f/kube-storage-version-migrator-operator/1.log"
Apr 24 20:13:52.495152 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:52.495124 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dxjdh_c4d6f584-a9a2-4297-9d42-6683202fc40f/kube-storage-version-migrator-operator/0.log"
Apr 24 20:13:53.391890 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:53.391862 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qxnnz_1186bd80-4999-47f8-b309-3246becab924/kube-multus-additional-cni-plugins/0.log"
Apr 24 20:13:53.412330 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:53.412303 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qxnnz_1186bd80-4999-47f8-b309-3246becab924/egress-router-binary-copy/0.log"
Apr 24 20:13:53.435718 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:53.435690 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qxnnz_1186bd80-4999-47f8-b309-3246becab924/cni-plugins/0.log"
Apr 24 20:13:53.456045 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:53.456013 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qxnnz_1186bd80-4999-47f8-b309-3246becab924/bond-cni-plugin/0.log"
Apr 24 20:13:53.478662 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:53.478629 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qxnnz_1186bd80-4999-47f8-b309-3246becab924/routeoverride-cni/0.log"
Apr 24 20:13:53.499321 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:53.499295 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qxnnz_1186bd80-4999-47f8-b309-3246becab924/whereabouts-cni-bincopy/0.log"
Apr 24 20:13:53.522025 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:53.521995 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qxnnz_1186bd80-4999-47f8-b309-3246becab924/whereabouts-cni/0.log"
Apr 24 20:13:53.969125 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:53.969096 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lx4cb_3f9a2db2-5738-4b14-a835-27706918a96e/kube-multus/0.log"
Apr 24 20:13:53.988702 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:53.988673 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4lz47_060d8b4b-7fbe-4109-888d-a5c4822cff6e/network-metrics-daemon/0.log"
Apr 24 20:13:54.008525 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:54.008498 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4lz47_060d8b4b-7fbe-4109-888d-a5c4822cff6e/kube-rbac-proxy/0.log"
Apr 24 20:13:55.554319 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:55.554291 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-controller/0.log"
Apr 24 20:13:55.571081 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:55.571053 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/0.log"
Apr 24 20:13:55.606265 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:55.606234 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovn-acl-logging/1.log"
Apr 24 20:13:55.629425 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:55.629396 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/kube-rbac-proxy-node/0.log"
Apr 24 20:13:55.652009 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:55.651981 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 20:13:55.668476 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:55.668441 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/northd/0.log"
Apr 24 20:13:55.687676 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:55.687650 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/nbdb/0.log"
Apr 24 20:13:55.707761 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:55.707733 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/sbdb/0.log"
Apr 24 20:13:55.873328 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:55.873295 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj77d_d49788ac-b5cf-4dfb-9670-2385671fc731/ovnkube-controller/0.log"
Apr 24 20:13:56.893729 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:56.893698 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-b49hn_7c6541c7-0cb3-447d-baaa-7d58f2cba8e2/network-check-target-container/0.log"
Apr 24 20:13:57.830908 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:57.830873 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-b2kgr_6923e9e9-0a10-445f-9824-663ad232ab97/iptables-alerter/0.log"
Apr 24 20:13:58.526579 ip-10-0-129-23 kubenswrapper[2568]: I0424 20:13:58.526542 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lppw8_ef8e461f-b2c2-42d8-9ae0-451164801b2f/tuned/0.log"