Mar 18 16:42:03.847219 ip-10-0-129-201 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Mar 18 16:42:03.847233 ip-10-0-129-201 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Mar 18 16:42:03.847241 ip-10-0-129-201 systemd[1]: kubelet.service: Failed with result 'resources'.
Mar 18 16:42:03.847516 ip-10-0-129-201 systemd[1]: Failed to start Kubernetes Kubelet.
Mar 18 16:42:14.084429 ip-10-0-129-201 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Mar 18 16:42:14.084449 ip-10-0-129-201 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 10fcef0ae3364316b84c243d7604ccfb --
Mar 18 16:44:35.766078 ip-10-0-129-201 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:44:36.247324 ip-10-0-129-201 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:36.247324 ip-10-0-129-201 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:44:36.247324 ip-10-0-129-201 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:36.247324 ip-10-0-129-201 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:44:36.247324 ip-10-0-129-201 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:36.250727 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.250599 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:44:36.255157 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255136 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:36.255157 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255156 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:36.255157 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255160 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255164 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255168 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255171 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255174 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255176 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255179 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255182 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255185 2570 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255188 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255192 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255196 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255200 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255202 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255205 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255208 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255210 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255212 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255215 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:36.255262 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255218 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255221 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255225 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255228 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255231 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255233 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255236 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255239 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255242 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255244 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255247 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255249 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255252 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255254 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255257 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255261 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255264 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255267 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255269 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:36.255728 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255272 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255274 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255277 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255279 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255282 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255285 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255287 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255290 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255292 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255294 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255297 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255299 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255301 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255304 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255307 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255309 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255312 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255314 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255317 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255319 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:36.256188 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255322 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255324 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255327 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255329 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255332 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255334 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255338 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255341 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255345 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255348 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255351 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255353 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255356 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255358 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255360 2570 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255363 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255365 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255368 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255370 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255373 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:36.256715 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255375 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255378 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255382 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255385 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255387 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255390 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255827 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255833 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255836 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255839 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255842 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255844 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255847 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255850 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255852 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255856 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255858 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255861 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255863 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255866 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:36.257200 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255870 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255874 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255876 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255883 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255886 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255889 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255891 2570 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255894 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255897 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255899 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255902 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255905 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255907 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255910 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255913 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255915 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255917 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255920 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255922 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255925 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:36.257661 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255927 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255930 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255932 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255935 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255938 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255940 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255943 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255945 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255948 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255950 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255953 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255956 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255959 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255962 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255965 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255967 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255970 2570 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255972 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255974 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255977 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:36.258187 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255979 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255982 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255984 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255988 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255991 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255994 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255996 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.255999 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256001 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256004 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256006 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256009 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256012 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256016 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256019 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256022 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256024 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256027 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:36.258675 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256029 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256031 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256034 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256037 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256040 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256042 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256046 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256048 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256051 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256054 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256056 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256059 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256061 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.256063 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256816 2570 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256826 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256833 2570 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256837 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256842 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256845 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256849 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:44:36.259136 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256854 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256858 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256861 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256864 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256867 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256871 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256874 2570 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256877 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256880 2570 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256882 2570 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256885 2570 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256888 2570 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256894 2570 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256897 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256900 2570 flags.go:64] FLAG: --config-dir=""
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256903 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256906 2570 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256910 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256914 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256917 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256920 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256924 2570 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256927 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256930 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256934 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:44:36.259662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256937 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256942 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256945 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256947 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256950 2570 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256953 2570 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256956 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256960 2570 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256964 2570 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256967 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256970 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256974 2570 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256978 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256980 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256983 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256986 2570 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256989 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256992 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 16:44:36.260266 ip-10-0-129-201
kubenswrapper[2570]: I0318 16:44:36.256995 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.256998 2570 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257001 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257004 2570 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257007 2570 flags.go:64] FLAG: --feature-gates="" Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257011 2570 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257014 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 16:44:36.260266 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257017 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257020 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257024 2570 flags.go:64] FLAG: --healthz-port="10248" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257027 2570 flags.go:64] FLAG: --help="false" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257031 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-129-201.ec2.internal" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257034 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257037 2570 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257040 2570 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257043 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257046 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257049 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257052 2570 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257055 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257058 2570 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257064 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257067 2570 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257070 2570 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257073 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257076 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257079 2570 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257082 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:44:36.260887 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:44:36.257085 2570 flags.go:64] FLAG: --lock-file="" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257088 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257091 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:44:36.260887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257094 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257100 2570 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257103 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257105 2570 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257108 2570 flags.go:64] FLAG: --logging-format="text" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257111 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257114 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257117 2570 flags.go:64] FLAG: --manifest-url="" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257120 2570 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257125 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257128 2570 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257132 2570 flags.go:64] FLAG: --max-pods="110" Mar 18 
16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257135 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257139 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257142 2570 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257145 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257148 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257151 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257154 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257162 2570 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257165 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257168 2570 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257173 2570 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:44:36.261492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257176 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257183 2570 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:44:36.257186 2570 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257189 2570 flags.go:64] FLAG: --pods-per-core="0" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257191 2570 flags.go:64] FLAG: --port="10250" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257194 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257197 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e6897ada62734580" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257200 2570 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257203 2570 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257207 2570 flags.go:64] FLAG: --register-node="true" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257210 2570 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257213 2570 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257216 2570 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257219 2570 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257222 2570 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257225 2570 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257229 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257232 2570 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257235 2570 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257238 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257240 2570 flags.go:64] FLAG: --runonce="false" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257243 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257247 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257252 2570 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257254 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257257 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:44:36.262057 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257260 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257263 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257267 2570 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257270 2570 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257273 2570 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257280 2570 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 
16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257283 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257286 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257289 2570 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257292 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257298 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257301 2570 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257304 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257309 2570 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257312 2570 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257314 2570 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257317 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257320 2570 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257323 2570 flags.go:64] FLAG: --v="2" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257328 2570 flags.go:64] FLAG: --version="false" Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257332 2570 flags.go:64] FLAG: --vmodule="" 
Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257336 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.257339 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257444 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:36.262686 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257448 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257451 2570 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257454 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257456 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257459 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257461 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257464 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257466 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257469 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257471 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257474 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257476 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257478 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257482 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257484 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257487 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257489 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257491 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257494 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257496 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:36.263274 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257498 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257501 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257503 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257506 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257509 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257511 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257513 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257516 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257518 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257520 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257523 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257525 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257528 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257530 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257532 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257535 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257537 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257539 2570 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257542 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257544 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:36.263798 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257547 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257549 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257552 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257554 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257557 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257561 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257565 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257568 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257571 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257574 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257577 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257579 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257582 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257584 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257586 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257589 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257591 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257594 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257596 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:36.264273 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257599 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257601 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257604 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257606 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257609 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257611 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257613 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257616 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257618 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257621 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257623 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257626 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257628 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257631 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257634 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257636 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257639 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257641 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257645 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257647 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:36.264852 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257650 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:36.265628 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257652 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:36.265628 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257654 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:36.265628 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257657 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:36.265628 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257659 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:36.265628 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.257663 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:36.265628 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.258266 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:44:36.266713 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.266674 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 18 16:44:36.266793 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.266718 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 16:44:36.266793 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266776 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:36.266793 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266782 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:36.266793 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266786 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:36.266793 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266789 2570 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:36.266793 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266792 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:36.266793 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266794 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266798 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266801 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266803 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266806 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266808 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266811 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266814 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266817 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266819 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266822 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266824 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266827 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266831 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266834 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266837 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266839 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266842 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266844 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:36.266984 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266847 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266849 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266852 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266854 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266858 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266862 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266864 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318
16:44:36.266867 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266870 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266872 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266875 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266877 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266880 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266883 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266885 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266888 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266891 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266894 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266896 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266899 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 
16:44:36.267475 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266902 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266904 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266907 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266909 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266912 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266914 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266917 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266921 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266926 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266929 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266932 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266934 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266937 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266939 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266942 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266945 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266947 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266950 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266953 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:36.267973 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266956 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:36.268430 ip-10-0-129-201 
kubenswrapper[2570]: W0318 16:44:36.266959 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266962 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266964 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266967 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266969 2570 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266972 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266987 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266991 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266994 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266996 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.266999 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267002 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267006 2570 feature_gate.go:328] unrecognized feature 
gate: KMSEncryptionProvider Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267009 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267011 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267014 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267017 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267019 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267022 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:36.268430 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267024 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267027 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267030 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.267035 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true 
UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267165 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267171 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267174 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267178 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267180 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267183 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267186 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267189 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267191 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267194 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267196 2570 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:36.268919 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267199 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:36.269278 
ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267201 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267204 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267206 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267209 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267212 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267214 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267217 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267220 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267222 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267225 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267228 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267231 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267233 2570 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDC Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267236 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267238 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267241 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267243 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267245 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267248 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:36.269278 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267251 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267254 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267256 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267259 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267261 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267265 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267268 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267271 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267274 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267277 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267280 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267282 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267285 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267287 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267290 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267292 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267294 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267298 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267301 2570 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:36.269776 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267304 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267306 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267309 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267311 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267313 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267317 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267319 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267322 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267324 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267327 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267329 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267331 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:36.270233 ip-10-0-129-201 
kubenswrapper[2570]: W0318 16:44:36.267334 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267337 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267340 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267343 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267345 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267348 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267350 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267353 2570 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:36.270233 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267355 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267357 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267360 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267363 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267365 2570 feature_gate.go:328] 
unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267368 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267370 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267373 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267375 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267377 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267380 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267383 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267385 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267388 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267390 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:36.270693 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:36.267393 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:36.271072 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.267397 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:36.271072 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.268098 2570 server.go:962] "Client rotation is on, will bootstrap in background" Mar 18 16:44:36.274241 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.274224 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 16:44:36.275182 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.275169 2570 server.go:1019] "Starting client certificate rotation" Mar 18 16:44:36.275298 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.275278 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:36.275335 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.275319 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:36.302147 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.302122 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:36.304746 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.304722 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:36.318672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.318647 2570 log.go:25] "Validated CRI v1 runtime API" Mar 18 16:44:36.324148 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.324126 2570 log.go:25] "Validated CRI v1 image API" Mar 18 16:44:36.325999 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:44:36.325979 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 16:44:36.329236 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.329208 2570 fs.go:135] Filesystem UUIDs: map[38841edc-a795-4a0c-a1db-83010ff210b8:/dev/nvme0n1p4 5c9c429a-9ea2-44ed-8579-0fcd0e384532:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Mar 18 16:44:36.329330 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.329232 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 18 16:44:36.334067 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.334045 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:44:36.335068 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.334930 2570 manager.go:217] Machine: {Timestamp:2026-03-18 16:44:36.333416133 +0000 UTC m=+0.439067485 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3204358 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2feb9c5a1d1e98516b73a7a1da1291 SystemUUID:ec2feb9c-5a1d-1e98-516b-73a7a1da1291 BootID:10fcef0a-e336-4316-b84c-243d7604ccfb Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:20:6d:86:29:b7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:20:6d:86:29:b7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:82:a0:2e:f6:50 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 16:44:36.335138 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.335070 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 16:44:36.335187 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.335173 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 16:44:36.340263 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.340225 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 16:44:36.340411 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.340262 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-201.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 16:44:36.340461 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.340421 2570 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 16:44:36.340461 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.340431 2570 container_manager_linux.go:306] "Creating device plugin manager"
Mar 18 16:44:36.340461 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.340444 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:44:36.341215 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.341204 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:44:36.342760 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.342748 2570 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:44:36.342880 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.342870 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 18 16:44:36.345994 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.345981 2570 kubelet.go:491] "Attempting to sync node with API server"
Mar 18 16:44:36.346041 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.345999 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 16:44:36.346041 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.346019 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 16:44:36.346041 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.346032 2570 kubelet.go:397] "Adding apiserver pod source"
Mar 18 16:44:36.346123 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.346042 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 16:44:36.347752 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.347736 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:44:36.347814 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.347757 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:44:36.351309 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.351288 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1"
Mar 18 16:44:36.353624 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.353604 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 18 16:44:36.355157 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355152 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355172 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355181 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355191 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355202 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355210 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355221 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355228 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355238 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355247 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 16:44:36.355269 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355267 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 16:44:36.355549 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.355280 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 16:44:36.356169 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.356145 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 18 16:44:36.356211 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.356177 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-201.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 18 16:44:36.357342 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.357330 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 16:44:36.357398 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.357347 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Mar 18 16:44:36.360418 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.360393 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nxtsr"
Mar 18 16:44:36.361033 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.361018 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 18 16:44:36.361119 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.361066 2570 server.go:1295] "Started kubelet"
Mar 18 16:44:36.361199 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.361153 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 16:44:36.361306 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.361238 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 16:44:36.361416 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.361325 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 18 16:44:36.362086 ip-10-0-129-201 systemd[1]: Started Kubernetes Kubelet.
Mar 18 16:44:36.362859 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.362794 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 16:44:36.363689 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.363675 2570 server.go:317] "Adding debug handlers to kubelet server"
Mar 18 16:44:36.366399 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.366379 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nxtsr"
Mar 18 16:44:36.367746 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.367725 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-201.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 16:44:36.368867 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.367693 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-201.ec2.internal.189dfd3f4084766b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-201.ec2.internal,UID:ip-10-0-129-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-201.ec2.internal,},FirstTimestamp:2026-03-18 16:44:36.361033323 +0000 UTC m=+0.466684682,LastTimestamp:2026-03-18 16:44:36.361033323 +0000 UTC m=+0.466684682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-201.ec2.internal,}"
Mar 18 16:44:36.369017 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.368996 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Mar 18 16:44:36.369591 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.369575 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 16:44:36.370329 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.370311 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 18 16:44:36.370493 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.370469 2570 factory.go:55] Registering systemd factory
Mar 18 16:44:36.370493 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.370493 2570 factory.go:223] Registration of the systemd container factory successfully
Mar 18 16:44:36.370803 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.370760 2570 factory.go:153] Registering CRI-O factory
Mar 18 16:44:36.370803 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.370776 2570 factory.go:223] Registration of the crio container factory successfully
Mar 18 16:44:36.370901 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.370847 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 16:44:36.370901 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.370892 2570 factory.go:103] Registering Raw factory
Mar 18 16:44:36.370992 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.370910 2570 manager.go:1196] Started watching for new ooms in manager
Mar 18 16:44:36.371106 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.371086 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:36.371290 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.370314 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Mar 18 16:44:36.371407 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.371392 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 18 16:44:36.371633 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.371620 2570 reconstruct.go:97] "Volume reconstruction finished"
Mar 18 16:44:36.371633 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.371627 2570 manager.go:319] Starting recovery of all containers
Mar 18 16:44:36.371633 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.371634 2570 reconciler.go:26] "Reconciler: start to sync state"
Mar 18 16:44:36.373256 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.373228 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 18 16:44:36.380371 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.380332 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:36.382776 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.382602 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-201.ec2.internal\" not found" node="ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.385268 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.385250 2570 manager.go:324] Recovery completed
Mar 18 16:44:36.386904 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.386878 2570 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Mar 18 16:44:36.390122 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.390109 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:36.392669 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.392651 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:36.392839 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.392688 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:36.392839 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.392719 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:36.393267 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.393254 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Mar 18 16:44:36.393317 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.393266 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Mar 18 16:44:36.393317 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.393288 2570 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:44:36.394571 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.394559 2570 policy_none.go:49] "None policy: Start"
Mar 18 16:44:36.394628 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.394576 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 18 16:44:36.394628 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.394585 2570 state_mem.go:35] "Initializing new in-memory state store"
Mar 18 16:44:36.436433 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.436413 2570 manager.go:341] "Starting Device Plugin manager"
Mar 18 16:44:36.453910 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.436481 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 18 16:44:36.453910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.436495 2570 server.go:85] "Starting device plugin registration server"
Mar 18 16:44:36.453910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.436813 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 18 16:44:36.453910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.436836 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 18 16:44:36.453910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.436914 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 18 16:44:36.453910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.436995 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 18 16:44:36.453910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.437002 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 18 16:44:36.453910 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.437955 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Mar 18 16:44:36.453910 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.437995 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:36.512849 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.512757 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 18 16:44:36.514190 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.514172 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 18 16:44:36.514277 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.514204 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 18 16:44:36.514277 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.514228 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 18 16:44:36.514277 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.514236 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Mar 18 16:44:36.514403 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.514279 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 18 16:44:36.517336 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.517256 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:36.537717 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.537669 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:36.538671 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.538650 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:36.538782 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.538683 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:36.538782 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.538715 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:36.538782 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.538761 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.546942 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.546916 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.547020 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.546944 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-201.ec2.internal\": node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:36.555597 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.555574 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:36.615427 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.615398 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"]
Mar 18 16:44:36.615514 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.615489 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:36.617114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.617095 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:36.617185 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.617129 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:36.617185 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.617142 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:36.618444 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.618429 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:36.618536 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.618520 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.618590 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.618552 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:36.619469 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.619455 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:36.619554 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.619483 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:36.619554 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.619503 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:36.619554 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.619458 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:36.619653 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.619563 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:36.619653 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.619575 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:36.620877 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.620862 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.620928 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.620896 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:36.621657 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.621640 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:36.621724 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.621674 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:36.621724 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.621689 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:36.647624 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.647597 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-201.ec2.internal\" not found" node="ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.652387 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.652366 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-201.ec2.internal\" not found" node="ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.656424 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.656404 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:36.672662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.672633 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7a62256ddb99a5d51981b06e48f8ed26-config\") pod \"kube-apiserver-proxy-ip-10-0-129-201.ec2.internal\" (UID: \"7a62256ddb99a5d51981b06e48f8ed26\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.672662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.672665 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f6cf386473d7d5f0f36c10779f62f49b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal\" (UID: \"f6cf386473d7d5f0f36c10779f62f49b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.672859 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.672732 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6cf386473d7d5f0f36c10779f62f49b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal\" (UID: \"f6cf386473d7d5f0f36c10779f62f49b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.757412 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.757380 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:36.773812 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.773754 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7a62256ddb99a5d51981b06e48f8ed26-config\") pod \"kube-apiserver-proxy-ip-10-0-129-201.ec2.internal\" (UID: \"7a62256ddb99a5d51981b06e48f8ed26\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.773812 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.773786 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f6cf386473d7d5f0f36c10779f62f49b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal\" (UID: \"f6cf386473d7d5f0f36c10779f62f49b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.773812 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.773812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6cf386473d7d5f0f36c10779f62f49b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal\" (UID: \"f6cf386473d7d5f0f36c10779f62f49b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.773965 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.773858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6cf386473d7d5f0f36c10779f62f49b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal\" (UID: \"f6cf386473d7d5f0f36c10779f62f49b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.773965 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.773864 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f6cf386473d7d5f0f36c10779f62f49b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal\" (UID: \"f6cf386473d7d5f0f36c10779f62f49b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.773965 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.773865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7a62256ddb99a5d51981b06e48f8ed26-config\") pod \"kube-apiserver-proxy-ip-10-0-129-201.ec2.internal\" (UID: \"7a62256ddb99a5d51981b06e48f8ed26\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.858234 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.858202 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:36.950835 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.950799 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.953504 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:36.953480 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:36.959295 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:36.959270 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:37.059944 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:37.059854 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:37.160508 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:37.160464 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:37.261022 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:37.260983 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-201.ec2.internal\" not found"
Mar 18 16:44:37.275426 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.275399 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 18 16:44:37.275603 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.275583 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:44:37.275656 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.275615 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:44:37.358119 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.358054 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:37.369478 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.369451 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 18 16:44:37.369478 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.369467 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:39:36 +0000 UTC" deadline="2027-12-14 12:05:02.14628563 +0000 UTC"
Mar 18 16:44:37.369662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.369493 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15259h20m24.776795926s"
Mar 18 16:44:37.369909 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.369894 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:37.378158 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.378133 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 18 16:44:37.380222 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.380187 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal"
Mar 18 16:44:37.381072 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.381052 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:44:37.389146 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.389124 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 18 16:44:37.407887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.407838 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6wl2s"
Mar 18 16:44:37.416359 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.416330 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6wl2s"
Mar 18 16:44:37.487503 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:37.487464 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a62256ddb99a5d51981b06e48f8ed26.slice/crio-0b15af7a2ef6487b3b56d914b41ee171eeb3281337349e8fad150a94b8735f35 WatchSource:0}: Error finding container 0b15af7a2ef6487b3b56d914b41ee171eeb3281337349e8fad150a94b8735f35: Status 404 returned error can't find the container with id 0b15af7a2ef6487b3b56d914b41ee171eeb3281337349e8fad150a94b8735f35
Mar 18 16:44:37.487971 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:37.487947 2570 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6cf386473d7d5f0f36c10779f62f49b.slice/crio-01f496db8fa230eea98f98452abc91c2943aa762edc5d448ef85f384e53c6283 WatchSource:0}: Error finding container 01f496db8fa230eea98f98452abc91c2943aa762edc5d448ef85f384e53c6283: Status 404 returned error can't find the container with id 01f496db8fa230eea98f98452abc91c2943aa762edc5d448ef85f384e53c6283 Mar 18 16:44:37.493031 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.493015 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:44:37.517327 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.517267 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal" event={"ID":"f6cf386473d7d5f0f36c10779f62f49b","Type":"ContainerStarted","Data":"01f496db8fa230eea98f98452abc91c2943aa762edc5d448ef85f384e53c6283"} Mar 18 16:44:37.518260 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.518234 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal" event={"ID":"7a62256ddb99a5d51981b06e48f8ed26","Type":"ContainerStarted","Data":"0b15af7a2ef6487b3b56d914b41ee171eeb3281337349e8fad150a94b8735f35"} Mar 18 16:44:37.931719 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:37.931671 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:38.347304 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.347220 2570 apiserver.go:52] "Watching apiserver" Mar 18 16:44:38.353079 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.353045 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 18 16:44:38.353429 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.353403 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/network-metrics-daemon-ktpw5","kube-system/konnectivity-agent-f4pdp","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal","openshift-multus/multus-additional-cni-plugins-ptdgh","openshift-network-diagnostics/network-check-target-x724s","openshift-network-operator/iptables-alerter-9bgx9","openshift-ovn-kubernetes/ovnkube-node-gk7ln","kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal","openshift-cluster-node-tuning-operator/tuned-4dfx2","openshift-dns/node-resolver-blj9x","openshift-image-registry/node-ca-8mt8g","openshift-multus/multus-cnpd6"] Mar 18 16:44:38.357472 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.357438 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.359104 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.359079 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.361390 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.360235 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:38.361390 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.360262 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:38.361390 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.360525 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jlxmp\"" Mar 18 16:44:38.361390 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.361044 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:38.361664 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.361551 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 18 16:44:38.361664 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.361575 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 18 16:44:38.362062 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.361916 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 18 16:44:38.362184 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.362170 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kszpw\"" Mar 18 16:44:38.362932 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.362822 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.363658 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.363629 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 18 16:44:38.363809 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.363786 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2vbv9\"" Mar 18 16:44:38.363882 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.363847 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 18 16:44:38.364240 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.364218 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:44:38.364351 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.364303 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557" Mar 18 16:44:38.365263 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.365106 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 18 16:44:38.365263 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.365146 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 18 16:44:38.365263 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.365217 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t8cpw\"" Mar 18 16:44:38.365661 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.365642 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 18 16:44:38.365873 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.365857 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 18 16:44:38.365977 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.365958 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 18 16:44:38.370910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.370888 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.371047 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.371034 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.372619 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.372594 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:38.372757 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.372674 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125" Mar 18 16:44:38.373525 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.373496 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:38.373631 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.373540 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vgjtp\"" Mar 18 16:44:38.373631 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.373552 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 18 16:44:38.373785 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.373641 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 18 16:44:38.373845 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.373807 2570 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 18 16:44:38.373945 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.373927 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 18 16:44:38.374006 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.373967 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 18 16:44:38.374006 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.373927 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:38.374006 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.373935 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 18 16:44:38.374251 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.374236 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hhpw6\"" Mar 18 16:44:38.374471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.374454 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 18 16:44:38.375004 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.374978 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.379258 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.378296 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bkbxc\"" Mar 18 16:44:38.379258 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.378695 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 18 16:44:38.379258 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.378828 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 18 16:44:38.380938 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.380891 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.381059 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.380895 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.383166 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.383140 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 18 16:44:38.383253 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.383193 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nv6rb\"" Mar 18 16:44:38.383253 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.383224 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 18 16:44:38.383407 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.383391 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 18 16:44:38.383472 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.383457 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 18 16:44:38.383524 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.383396 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wld92\"" Mar 18 16:44:38.384033 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a-konnectivity-ca\") pod \"konnectivity-agent-f4pdp\" (UID: \"4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a\") " pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:38.384137 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384049 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/823ea246-d154-4a18-b04f-221eec27416d-cni-binary-copy\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.384137 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-run-systemd\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.384137 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384095 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-etc-openvswitch\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.384137 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384119 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-node-log\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384140 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-run-ovn-kubernetes\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:44:38.384165 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/935b66df-6c0c-487a-a4ff-9539cb02c34d-ovnkube-script-lib\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384191 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-sysctl-d\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384234 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/823ea246-d154-4a18-b04f-221eec27416d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384309 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-modprobe-d\") pod 
\"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384328 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-sysctl-conf\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384373 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lprfs\" (UniqueName: \"kubernetes.io/projected/aa591bdc-c7e1-4131-b41c-dd5043afacbf-kube-api-access-lprfs\") pod \"iptables-alerter-9bgx9\" (UID: \"aa591bdc-c7e1-4131-b41c-dd5043afacbf\") " pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-run-netns\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384415 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-var-lib-openvswitch\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384430 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-sys-fs\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384446 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/aa591bdc-c7e1-4131-b41c-dd5043afacbf-iptables-alerter-script\") pod \"iptables-alerter-9bgx9\" (UID: \"aa591bdc-c7e1-4131-b41c-dd5043afacbf\") " pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384460 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-device-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384476 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-kubernetes\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384489 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/935b66df-6c0c-487a-a4ff-9539cb02c34d-env-overrides\") pod \"ovnkube-node-gk7ln\" (UID: 
\"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.384579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384504 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-system-cni-dir\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384517 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-cni-bin\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384531 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-run-ovn\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384560 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-host\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384584 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr2pk\" (UniqueName: 
\"kubernetes.io/projected/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-kube-api-access-sr2pk\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384634 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-kubelet\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-slash\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384772 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-run-openvswitch\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:44:38.384796 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-cni-netd\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384819 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph244\" (UniqueName: \"kubernetes.io/projected/935b66df-6c0c-487a-a4ff-9539cb02c34d-kube-api-access-ph244\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-registration-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384869 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/823ea246-d154-4a18-b04f-221eec27416d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384905 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-sysconfig\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-sys\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.384973 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.385401 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-lib-modules\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385041 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-tuned\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385061 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-os-release\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-systemd-units\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385099 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-systemd\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-log-socket\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385157 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/935b66df-6c0c-487a-a4ff-9539cb02c34d-ovn-node-metrics-cert\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385182 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a-agent-certs\") pod \"konnectivity-agent-f4pdp\" (UID: \"4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a\") " pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385207 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aa591bdc-c7e1-4131-b41c-dd5043afacbf-host-slash\") pod \"iptables-alerter-9bgx9\" (UID: \"aa591bdc-c7e1-4131-b41c-dd5043afacbf\") " pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-var-lib-kubelet\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385257 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-tmp\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385293 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385316 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfcfc\" (UniqueName: \"kubernetes.io/projected/7b1009d6-de9a-4117-ae9e-b06a56a5138d-kube-api-access-bfcfc\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385337 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-run\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385359 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-cnibin\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385383 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqk7m\" (UniqueName: \"kubernetes.io/projected/823ea246-d154-4a18-b04f-221eec27416d-kube-api-access-cqk7m\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 
16:44:38.385989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/935b66df-6c0c-487a-a4ff-9539cb02c34d-ovnkube-config\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.386479 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385441 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-socket-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.386479 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.385460 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-etc-selinux\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.417680 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.417645 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:37 +0000 UTC" deadline="2027-12-13 04:10:54.869636296 +0000 UTC" Mar 18 16:44:38.417680 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.417676 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15227h26m16.451963521s" Mar 18 16:44:38.471832 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.471800 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 
18 16:44:38.480388 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.480354 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:38.486339 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-var-lib-kubelet\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.486501 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486352 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-tmp\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.486501 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.486501 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfcfc\" (UniqueName: \"kubernetes.io/projected/7b1009d6-de9a-4117-ae9e-b06a56a5138d-kube-api-access-bfcfc\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.486501 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486418 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-var-lib-kubelet\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.486501 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486442 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-system-cni-dir\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.486501 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486467 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-run-k8s-cni-cncf-io\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.486501 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486479 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.486825 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-var-lib-cni-multus\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.486825 
ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486538 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-run\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.486825 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486599 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-run\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.486825 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486635 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-cnibin\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.486825 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486685 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-var-lib-cni-bin\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.486825 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486727 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 16:44:38.486825 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqk7m\" (UniqueName: \"kubernetes.io/projected/823ea246-d154-4a18-b04f-221eec27416d-kube-api-access-cqk7m\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.486825 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486784 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/935b66df-6c0c-487a-a4ff-9539cb02c34d-ovnkube-config\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.486825 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486785 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-cnibin\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.487139 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486830 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-socket-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.487139 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486854 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-etc-selinux\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.487139 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486883 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/896404ec-1cea-4a36-9c02-cb5316bac310-hosts-file\") pod \"node-resolver-blj9x\" (UID: \"896404ec-1cea-4a36-9c02-cb5316bac310\") " pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.487139 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.486994 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpnxg\" (UniqueName: \"kubernetes.io/projected/896404ec-1cea-4a36-9c02-cb5316bac310-kube-api-access-fpnxg\") pod \"node-resolver-blj9x\" (UID: \"896404ec-1cea-4a36-9c02-cb5316bac310\") " pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.487139 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487016 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-etc-selinux\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.487344 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487256 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-socket-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.487490 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487473 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a-konnectivity-ca\") pod \"konnectivity-agent-f4pdp\" (UID: \"4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a\") " pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:38.487523 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487502 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/823ea246-d154-4a18-b04f-221eec27416d-cni-binary-copy\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.487566 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-run-systemd\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.487566 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487544 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-etc-openvswitch\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.487652 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487572 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-run-multus-certs\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.487652 
ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487595 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd7bp\" (UniqueName: \"kubernetes.io/projected/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-kube-api-access-qd7bp\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.487652 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487620 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmcdb\" (UniqueName: \"kubernetes.io/projected/aba28703-3193-48ac-bad1-170ac214d793-kube-api-access-rmcdb\") pod \"node-ca-8mt8g\" (UID: \"aba28703-3193-48ac-bad1-170ac214d793\") " pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.487652 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487644 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-var-lib-kubelet\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.487857 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487665 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-hostroot\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.487857 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487686 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-conf-dir\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " 
pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.487857 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487784 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/935b66df-6c0c-487a-a4ff-9539cb02c34d-ovnkube-config\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.487857 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487787 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/896404ec-1cea-4a36-9c02-cb5316bac310-tmp-dir\") pod \"node-resolver-blj9x\" (UID: \"896404ec-1cea-4a36-9c02-cb5316bac310\") " pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.487857 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487803 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-run-systemd\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.487857 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-etc-openvswitch\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.487857 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487845 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-node-log\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.488174 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487876 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-node-log\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.488174 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-run-ovn-kubernetes\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.488174 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/935b66df-6c0c-487a-a4ff-9539cb02c34d-ovnkube-script-lib\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.488174 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.487951 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-run-ovn-kubernetes\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.488174 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-run-netns\") pod \"multus-cnpd6\" (UID: 
\"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.488174 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488009 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a-konnectivity-ca\") pod \"konnectivity-agent-f4pdp\" (UID: \"4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a\") " pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:38.488174 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488117 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/823ea246-d154-4a18-b04f-221eec27416d-cni-binary-copy\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.488174 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488135 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-etc-kubernetes\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-sysctl-d\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488253 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/823ea246-d154-4a18-b04f-221eec27416d-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488297 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-sysctl-d\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488304 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aba28703-3193-48ac-bad1-170ac214d793-serviceca\") pod \"node-ca-8mt8g\" (UID: \"aba28703-3193-48ac-bad1-170ac214d793\") " pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488332 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-cni-dir\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488357 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-daemon-config\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488382 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-modprobe-d\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-sysctl-conf\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488457 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lprfs\" (UniqueName: \"kubernetes.io/projected/aa591bdc-c7e1-4131-b41c-dd5043afacbf-kube-api-access-lprfs\") pod \"iptables-alerter-9bgx9\" (UID: \"aa591bdc-c7e1-4131-b41c-dd5043afacbf\") " pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488473 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-run-netns\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-var-lib-openvswitch\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.488509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488508 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/935b66df-6c0c-487a-a4ff-9539cb02c34d-ovnkube-script-lib\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488520 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-sys-fs\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/aa591bdc-c7e1-4131-b41c-dd5043afacbf-iptables-alerter-script\") pod \"iptables-alerter-9bgx9\" (UID: \"aa591bdc-c7e1-4131-b41c-dd5043afacbf\") " pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.489121 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:44:38.488588 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-device-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488598 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-sys-fs\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488616 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-os-release\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488632 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-run-netns\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488648 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-kubernetes\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 
16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488666 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-var-lib-openvswitch\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488739 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/935b66df-6c0c-487a-a4ff-9539cb02c34d-env-overrides\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488756 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/823ea246-d154-4a18-b04f-221eec27416d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488768 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aba28703-3193-48ac-bad1-170ac214d793-host\") pod \"node-ca-8mt8g\" (UID: \"aba28703-3193-48ac-bad1-170ac214d793\") " pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488788 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-modprobe-d\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488808 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-device-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488828 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-system-cni-dir\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-cni-bin\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488875 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-cni-binary-copy\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.489121 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-socket-dir-parent\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488922 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-run-ovn\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/aa591bdc-c7e1-4131-b41c-dd5043afacbf-iptables-alerter-script\") pod \"iptables-alerter-9bgx9\" (UID: \"aa591bdc-c7e1-4131-b41c-dd5043afacbf\") " pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.489594 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/935b66df-6c0c-487a-a4ff-9539cb02c34d-env-overrides\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.489625 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-sysctl-conf\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.489651 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-host\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.489685 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-system-cni-dir\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.488957 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-run-ovn\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.489795 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-kubernetes\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.489687 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr2pk\" (UniqueName: \"kubernetes.io/projected/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-kube-api-access-sr2pk\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.489831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.489800 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-host\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.490260 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.489848 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-cni-bin\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.490260 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.489901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.490260 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.490063 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-kubelet\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.490260 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.490082 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.490260 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.490194 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-slash\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.492420 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.492396 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-run-openvswitch\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.492530 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.492441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-cni-netd\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.492530 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.490223 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-kubelet\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.492638 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.490372 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-slash\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.492638 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.492501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ph244\" (UniqueName: \"kubernetes.io/projected/935b66df-6c0c-487a-a4ff-9539cb02c34d-kube-api-access-ph244\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.492805 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.492775 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-registration-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.492805 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.492671 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-registration-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.492998 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.492816 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-run-openvswitch\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.492998 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.490914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-tmp\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.492998 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.492902 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-host-cni-netd\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.493450 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.492851 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/823ea246-d154-4a18-b04f-221eec27416d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.493540 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.493498 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-sysconfig\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.493540 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.493521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-sys\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.494000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.493978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-sysconfig\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.494000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.493988 
2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.494150 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494034 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-lib-modules\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.494150 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.493981 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-sys\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.494150 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-tuned\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.494150 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494085 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-os-release\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.494150 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:44:38.494111 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-systemd-units\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.494150 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494140 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-cnibin\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494149 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/823ea246-d154-4a18-b04f-221eec27416d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494169 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69vvh\" (UniqueName: \"kubernetes.io/projected/38dc6da4-4394-4935-80a5-6a872bf72125-kube-api-access-69vvh\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-systemd\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-log-socket\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/823ea246-d154-4a18-b04f-221eec27416d-os-release\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494208 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-lib-modules\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494247 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/935b66df-6c0c-487a-a4ff-9539cb02c34d-ovn-node-metrics-cert\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a-agent-certs\") pod \"konnectivity-agent-f4pdp\" (UID: 
\"4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a\") " pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494323 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aa591bdc-c7e1-4131-b41c-dd5043afacbf-host-slash\") pod \"iptables-alerter-9bgx9\" (UID: \"aa591bdc-c7e1-4131-b41c-dd5043afacbf\") " pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.494414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494402 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aa591bdc-c7e1-4131-b41c-dd5043afacbf-host-slash\") pod \"iptables-alerter-9bgx9\" (UID: \"aa591bdc-c7e1-4131-b41c-dd5043afacbf\") " pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.494923 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-systemd\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.494923 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494673 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-systemd-units\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.494923 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494733 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/935b66df-6c0c-487a-a4ff-9539cb02c34d-log-socket\") pod \"ovnkube-node-gk7ln\" (UID: 
\"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.494923 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.494795 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b1009d6-de9a-4117-ae9e-b06a56a5138d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.497114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.497037 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-etc-tuned\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.497216 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.497184 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:38.497216 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.497206 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:38.497326 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.497220 2570 projected.go:194] Error preparing data for projected volume kube-api-access-vtcfh for pod openshift-network-diagnostics/network-check-target-x724s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:38.497377 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.497327 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh podName:b64a1006-8f55-41c1-9d77-457180e9a557 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:38.997279377 +0000 UTC m=+3.102930735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vtcfh" (UniqueName: "kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh") pod "network-check-target-x724s" (UID: "b64a1006-8f55-41c1-9d77-457180e9a557") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:38.497680 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.497661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a-agent-certs\") pod \"konnectivity-agent-f4pdp\" (UID: \"4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a\") " pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:38.497931 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.497906 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/935b66df-6c0c-487a-a4ff-9539cb02c34d-ovn-node-metrics-cert\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.500298 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.500269 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr2pk\" (UniqueName: \"kubernetes.io/projected/1b7bffaa-80d9-47f1-9272-d89e7ac386cb-kube-api-access-sr2pk\") pod \"tuned-4dfx2\" (UID: \"1b7bffaa-80d9-47f1-9272-d89e7ac386cb\") " pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.500674 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.500649 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cqk7m\" (UniqueName: \"kubernetes.io/projected/823ea246-d154-4a18-b04f-221eec27416d-kube-api-access-cqk7m\") pod \"multus-additional-cni-plugins-ptdgh\" (UID: \"823ea246-d154-4a18-b04f-221eec27416d\") " pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.500957 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.500934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfcfc\" (UniqueName: \"kubernetes.io/projected/7b1009d6-de9a-4117-ae9e-b06a56a5138d-kube-api-access-bfcfc\") pod \"aws-ebs-csi-driver-node-9h2v2\" (UID: \"7b1009d6-de9a-4117-ae9e-b06a56a5138d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.501542 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.501507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph244\" (UniqueName: \"kubernetes.io/projected/935b66df-6c0c-487a-a4ff-9539cb02c34d-kube-api-access-ph244\") pod \"ovnkube-node-gk7ln\" (UID: \"935b66df-6c0c-487a-a4ff-9539cb02c34d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.501889 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.501854 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lprfs\" (UniqueName: \"kubernetes.io/projected/aa591bdc-c7e1-4131-b41c-dd5043afacbf-kube-api-access-lprfs\") pod \"iptables-alerter-9bgx9\" (UID: \"aa591bdc-c7e1-4131-b41c-dd5043afacbf\") " pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.594971 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.594925 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-system-cni-dir\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.594971 
ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.594977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-run-k8s-cni-cncf-io\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595213 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595003 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-var-lib-cni-multus\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595213 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595031 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-var-lib-cni-bin\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595213 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595050 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-run-k8s-cni-cncf-io\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595213 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595060 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/896404ec-1cea-4a36-9c02-cb5316bac310-hosts-file\") pod \"node-resolver-blj9x\" (UID: \"896404ec-1cea-4a36-9c02-cb5316bac310\") " pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.595213 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:44:38.595071 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-system-cni-dir\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595213 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595108 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-var-lib-cni-multus\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595213 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595121 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-var-lib-cni-bin\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595212 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/896404ec-1cea-4a36-9c02-cb5316bac310-hosts-file\") pod \"node-resolver-blj9x\" (UID: \"896404ec-1cea-4a36-9c02-cb5316bac310\") " pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595085 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpnxg\" (UniqueName: \"kubernetes.io/projected/896404ec-1cea-4a36-9c02-cb5316bac310-kube-api-access-fpnxg\") pod \"node-resolver-blj9x\" (UID: \"896404ec-1cea-4a36-9c02-cb5316bac310\") " pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595283 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-run-multus-certs\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595304 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd7bp\" (UniqueName: \"kubernetes.io/projected/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-kube-api-access-qd7bp\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595323 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmcdb\" (UniqueName: \"kubernetes.io/projected/aba28703-3193-48ac-bad1-170ac214d793-kube-api-access-rmcdb\") pod \"node-ca-8mt8g\" (UID: \"aba28703-3193-48ac-bad1-170ac214d793\") " pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-var-lib-kubelet\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595381 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-hostroot\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595405 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-conf-dir\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595402 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-run-multus-certs\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595456 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-var-lib-kubelet\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-hostroot\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595516 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-conf-dir\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595547 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/896404ec-1cea-4a36-9c02-cb5316bac310-tmp-dir\") pod \"node-resolver-blj9x\" (UID: \"896404ec-1cea-4a36-9c02-cb5316bac310\") " pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595577 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-run-netns\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595601 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-etc-kubernetes\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595640 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-host-run-netns\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595642 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aba28703-3193-48ac-bad1-170ac214d793-serviceca\") pod \"node-ca-8mt8g\" (UID: \"aba28703-3193-48ac-bad1-170ac214d793\") " pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595675 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-etc-kubernetes\") pod 
\"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595689 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-cni-dir\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595752 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-daemon-config\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595760 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-cni-dir\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595781 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595815 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-os-release\") pod \"multus-cnpd6\" (UID: 
\"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aba28703-3193-48ac-bad1-170ac214d793-host\") pod \"node-ca-8mt8g\" (UID: \"aba28703-3193-48ac-bad1-170ac214d793\") " pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595868 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-cni-binary-copy\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-socket-dir-parent\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.595914 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595931 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-cnibin\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/896404ec-1cea-4a36-9c02-cb5316bac310-tmp-dir\") pod \"node-resolver-blj9x\" (UID: \"896404ec-1cea-4a36-9c02-cb5316bac310\") " pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.595973 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.595956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69vvh\" (UniqueName: \"kubernetes.io/projected/38dc6da4-4394-4935-80a5-6a872bf72125-kube-api-access-69vvh\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:38.596630 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.595999 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs podName:38dc6da4-4394-4935-80a5-6a872bf72125 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:39.095977621 +0000 UTC m=+3.201628963 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs") pod "network-metrics-daemon-ktpw5" (UID: "38dc6da4-4394-4935-80a5-6a872bf72125") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:38.596630 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.596119 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aba28703-3193-48ac-bad1-170ac214d793-serviceca\") pod \"node-ca-8mt8g\" (UID: \"aba28703-3193-48ac-bad1-170ac214d793\") " pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.596630 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.596155 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-socket-dir-parent\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.596630 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.596188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-cnibin\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.596630 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.596199 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aba28703-3193-48ac-bad1-170ac214d793-host\") pod \"node-ca-8mt8g\" (UID: \"aba28703-3193-48ac-bad1-170ac214d793\") " pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.596630 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.596206 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-os-release\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.596630 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.596296 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-multus-daemon-config\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.596630 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.596321 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-cni-binary-copy\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.607759 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.607661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpnxg\" (UniqueName: \"kubernetes.io/projected/896404ec-1cea-4a36-9c02-cb5316bac310-kube-api-access-fpnxg\") pod \"node-resolver-blj9x\" (UID: \"896404ec-1cea-4a36-9c02-cb5316bac310\") " pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.608208 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.608181 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmcdb\" (UniqueName: \"kubernetes.io/projected/aba28703-3193-48ac-bad1-170ac214d793-kube-api-access-rmcdb\") pod \"node-ca-8mt8g\" (UID: \"aba28703-3193-48ac-bad1-170ac214d793\") " pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.608208 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.608197 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69vvh\" (UniqueName: 
\"kubernetes.io/projected/38dc6da4-4394-4935-80a5-6a872bf72125-kube-api-access-69vvh\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:38.608392 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.608184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd7bp\" (UniqueName: \"kubernetes.io/projected/f7a45c76-25de-47f6-8a92-fe9ea77a8a9c-kube-api-access-qd7bp\") pod \"multus-cnpd6\" (UID: \"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c\") " pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.680657 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.680613 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" Mar 18 16:44:38.689778 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.689740 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" Mar 18 16:44:38.697492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.697457 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:38.702142 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.702102 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" Mar 18 16:44:38.709865 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.709836 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-9bgx9" Mar 18 16:44:38.710524 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.710363 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:38.716622 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.716597 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:44:38.723324 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.723296 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-blj9x" Mar 18 16:44:38.729047 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.729020 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8mt8g" Mar 18 16:44:38.733729 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.733687 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cnpd6" Mar 18 16:44:38.998460 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:38.998424 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:44:38.998635 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.998599 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:38.998635 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.998625 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:38.998780 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.998639 2570 projected.go:194] Error preparing data for projected volume kube-api-access-vtcfh for pod openshift-network-diagnostics/network-check-target-x724s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:38.998780 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:38.998708 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh podName:b64a1006-8f55-41c1-9d77-457180e9a557 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:39.998681322 +0000 UTC m=+4.104332661 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vtcfh" (UniqueName: "kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh") pod "network-check-target-x724s" (UID: "b64a1006-8f55-41c1-9d77-457180e9a557") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:39.099089 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.099048 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:39.099275 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:39.099189 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:39.099275 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:39.099261 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs podName:38dc6da4-4394-4935-80a5-6a872bf72125 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.099242962 +0000 UTC m=+4.204894301 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs") pod "network-metrics-daemon-ktpw5" (UID: "38dc6da4-4394-4935-80a5-6a872bf72125") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:39.170788 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:39.170746 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod823ea246_d154_4a18_b04f_221eec27416d.slice/crio-71f01c7b3493f1812f11d0f079d7a394e19634c2e3cd309e9e2ab1c1ac162a5a WatchSource:0}: Error finding container 71f01c7b3493f1812f11d0f079d7a394e19634c2e3cd309e9e2ab1c1ac162a5a: Status 404 returned error can't find the container with id 71f01c7b3493f1812f11d0f079d7a394e19634c2e3cd309e9e2ab1c1ac162a5a Mar 18 16:44:39.172817 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:39.172787 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7a45c76_25de_47f6_8a92_fe9ea77a8a9c.slice/crio-a9cd5e92d933af5c5eb54374f16acf92cf3faa61ecad7a4469254e5da274d543 WatchSource:0}: Error finding container a9cd5e92d933af5c5eb54374f16acf92cf3faa61ecad7a4469254e5da274d543: Status 404 returned error can't find the container with id a9cd5e92d933af5c5eb54374f16acf92cf3faa61ecad7a4469254e5da274d543 Mar 18 16:44:39.175179 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:39.175150 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod935b66df_6c0c_487a_a4ff_9539cb02c34d.slice/crio-c61e16745e0817c0c3af3e41b5acf75bfdc172784427f1574994b73db4a530e0 WatchSource:0}: Error finding container c61e16745e0817c0c3af3e41b5acf75bfdc172784427f1574994b73db4a530e0: Status 404 returned error can't find the container with id c61e16745e0817c0c3af3e41b5acf75bfdc172784427f1574994b73db4a530e0 Mar 18 16:44:39.176855 
ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:39.176833 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaba28703_3193_48ac_bad1_170ac214d793.slice/crio-4c43219f49953535a368b81cd9450d597c54d4d99934b5990f65a88ef00f6796 WatchSource:0}: Error finding container 4c43219f49953535a368b81cd9450d597c54d4d99934b5990f65a88ef00f6796: Status 404 returned error can't find the container with id 4c43219f49953535a368b81cd9450d597c54d4d99934b5990f65a88ef00f6796 Mar 18 16:44:39.177182 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:44:39.177155 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b1009d6_de9a_4117_ae9e_b06a56a5138d.slice/crio-8fc5dedda721534c21c458a5847051b20c991bd4b047b57129343f96ccf2e37f WatchSource:0}: Error finding container 8fc5dedda721534c21c458a5847051b20c991bd4b047b57129343f96ccf2e37f: Status 404 returned error can't find the container with id 8fc5dedda721534c21c458a5847051b20c991bd4b047b57129343f96ccf2e37f Mar 18 16:44:39.418608 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.418297 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:37 +0000 UTC" deadline="2027-12-25 18:14:12.273798902 +0000 UTC" Mar 18 16:44:39.418608 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.418513 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15529h29m32.855293873s" Mar 18 16:44:39.515493 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.515459 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:39.515671 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:39.515603 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125" Mar 18 16:44:39.516072 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.516053 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:44:39.516180 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:39.516162 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:44:39.522270 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.522225 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9bgx9" event={"ID":"aa591bdc-c7e1-4131-b41c-dd5043afacbf","Type":"ContainerStarted","Data":"d769c8ee3b4e1cf310e89d8738a252aab0fe5f1bfbe3bb3077cd2a1e8db6c237"}
Mar 18 16:44:39.523488 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.523440 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-blj9x" event={"ID":"896404ec-1cea-4a36-9c02-cb5316bac310","Type":"ContainerStarted","Data":"c1495dbcb3daf49347cb7d3b520743e6cb152325b8c216610a13abd20a868b6b"}
Mar 18 16:44:39.525786 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.525748 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f4pdp" event={"ID":"4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a","Type":"ContainerStarted","Data":"3726597e67e45359921b9c4e78dde5bc118c7df4b5f244ed0853e08d0b30ac79"}
Mar 18 16:44:39.529243 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.529207 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerStarted","Data":"c61e16745e0817c0c3af3e41b5acf75bfdc172784427f1574994b73db4a530e0"}
Mar 18 16:44:39.533965 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.533930 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" event={"ID":"823ea246-d154-4a18-b04f-221eec27416d","Type":"ContainerStarted","Data":"71f01c7b3493f1812f11d0f079d7a394e19634c2e3cd309e9e2ab1c1ac162a5a"}
Mar 18 16:44:39.539381 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.538651 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal" event={"ID":"7a62256ddb99a5d51981b06e48f8ed26","Type":"ContainerStarted","Data":"181ab2404a8ca617a70baa8fe7423554d2b2ca2498a8e5869650675445b96d9c"}
Mar 18 16:44:39.540303 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.540254 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" event={"ID":"1b7bffaa-80d9-47f1-9272-d89e7ac386cb","Type":"ContainerStarted","Data":"ba92a0d7c64cd594e9ad7ed6312f83b8a9df24fa3dc6b7119695ef10db00c80f"}
Mar 18 16:44:39.544438 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.544230 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" event={"ID":"7b1009d6-de9a-4117-ae9e-b06a56a5138d","Type":"ContainerStarted","Data":"8fc5dedda721534c21c458a5847051b20c991bd4b047b57129343f96ccf2e37f"}
Mar 18 16:44:39.545649 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.545513 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8mt8g" event={"ID":"aba28703-3193-48ac-bad1-170ac214d793","Type":"ContainerStarted","Data":"4c43219f49953535a368b81cd9450d597c54d4d99934b5990f65a88ef00f6796"}
Mar 18 16:44:39.549064 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.549025 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cnpd6" event={"ID":"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c","Type":"ContainerStarted","Data":"a9cd5e92d933af5c5eb54374f16acf92cf3faa61ecad7a4469254e5da274d543"}
Mar 18 16:44:39.585382 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:39.584419 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-201.ec2.internal" podStartSLOduration=2.58440391 podStartE2EDuration="2.58440391s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:39.583975883 +0000 UTC m=+3.689627244" watchObservedRunningTime="2026-03-18 16:44:39.58440391 +0000 UTC m=+3.690055271"
Mar 18 16:44:40.004873 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:40.004820 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:40.005060 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:40.005016 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:40.005060 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:40.005035 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:40.005060 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:40.005047 2570 projected.go:194] Error preparing data for projected volume kube-api-access-vtcfh for pod openshift-network-diagnostics/network-check-target-x724s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:40.005217 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:40.005106 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh podName:b64a1006-8f55-41c1-9d77-457180e9a557 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:42.005087635 +0000 UTC m=+6.110738979 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vtcfh" (UniqueName: "kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh") pod "network-check-target-x724s" (UID: "b64a1006-8f55-41c1-9d77-457180e9a557") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:40.106569 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:40.106537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:40.106789 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:40.106769 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:40.106859 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:40.106843 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs podName:38dc6da4-4394-4935-80a5-6a872bf72125 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:42.106823113 +0000 UTC m=+6.212474474 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs") pod "network-metrics-daemon-ktpw5" (UID: "38dc6da4-4394-4935-80a5-6a872bf72125") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:40.560858 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:40.560817 2570 generic.go:358] "Generic (PLEG): container finished" podID="f6cf386473d7d5f0f36c10779f62f49b" containerID="8734d468246fa5c95bbccbdf519bfcc0e97210a490837b0db16ba867b7a38db4" exitCode=0
Mar 18 16:44:40.561898 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:40.561868 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal" event={"ID":"f6cf386473d7d5f0f36c10779f62f49b","Type":"ContainerDied","Data":"8734d468246fa5c95bbccbdf519bfcc0e97210a490837b0db16ba867b7a38db4"}
Mar 18 16:44:41.514794 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:41.514761 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:41.515010 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:41.514906 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:44:41.515501 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:41.515343 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:41.515501 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:41.515433 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:44:41.570135 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:41.570098 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal" event={"ID":"f6cf386473d7d5f0f36c10779f62f49b","Type":"ContainerStarted","Data":"d10ae99b46a5433112c9f79a3c6452f90e4c96440ba821230c51f19bdbbda42d"}
Mar 18 16:44:42.023369 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.023331 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:42.023579 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:42.023522 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:42.023579 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:42.023542 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:42.023579 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:42.023555 2570 projected.go:194] Error preparing data for projected volume kube-api-access-vtcfh for pod openshift-network-diagnostics/network-check-target-x724s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:42.023818 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:42.023615 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh podName:b64a1006-8f55-41c1-9d77-457180e9a557 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:46.023597513 +0000 UTC m=+10.129248858 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vtcfh" (UniqueName: "kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh") pod "network-check-target-x724s" (UID: "b64a1006-8f55-41c1-9d77-457180e9a557") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:42.124341 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.123743 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:42.124341 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:42.123923 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:42.124341 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:42.123997 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs podName:38dc6da4-4394-4935-80a5-6a872bf72125 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:46.123976284 +0000 UTC m=+10.229627626 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs") pod "network-metrics-daemon-ktpw5" (UID: "38dc6da4-4394-4935-80a5-6a872bf72125") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:42.481178 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.481123 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-201.ec2.internal" podStartSLOduration=5.481106245 podStartE2EDuration="5.481106245s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:41.588517651 +0000 UTC m=+5.694169014" watchObservedRunningTime="2026-03-18 16:44:42.481106245 +0000 UTC m=+6.586757606"
Mar 18 16:44:42.482394 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.481768 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hdpjw"]
Mar 18 16:44:42.484740 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.484645 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:42.484889 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:42.484751 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:44:42.528230 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.528194 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f033a5d8-a3ec-47df-9593-01096396aeb5-dbus\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:42.528414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.528247 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:42.528414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.528333 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f033a5d8-a3ec-47df-9593-01096396aeb5-kubelet-config\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:42.628929 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.628895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f033a5d8-a3ec-47df-9593-01096396aeb5-kubelet-config\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:42.629369 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.628947 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f033a5d8-a3ec-47df-9593-01096396aeb5-dbus\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:42.629369 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.628972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:42.629369 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:42.629166 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:42.629369 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:42.629269 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret podName:f033a5d8-a3ec-47df-9593-01096396aeb5 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:43.129248708 +0000 UTC m=+7.234900067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret") pod "global-pull-secret-syncer-hdpjw" (UID: "f033a5d8-a3ec-47df-9593-01096396aeb5") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:42.629581 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.629549 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f033a5d8-a3ec-47df-9593-01096396aeb5-kubelet-config\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:42.629802 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:42.629779 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f033a5d8-a3ec-47df-9593-01096396aeb5-dbus\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:43.134664 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:43.134606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:43.134874 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:43.134852 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:43.135021 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:43.134983 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret podName:f033a5d8-a3ec-47df-9593-01096396aeb5 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:44.134960846 +0000 UTC m=+8.240612187 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret") pod "global-pull-secret-syncer-hdpjw" (UID: "f033a5d8-a3ec-47df-9593-01096396aeb5") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:43.515180 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:43.515094 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:43.515349 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:43.515094 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:43.515349 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:43.515264 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:44:43.515349 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:43.515314 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:44:44.145002 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:44.144960 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:44.145457 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:44.145090 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:44.145457 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:44.145144 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret podName:f033a5d8-a3ec-47df-9593-01096396aeb5 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:46.14512913 +0000 UTC m=+10.250780473 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret") pod "global-pull-secret-syncer-hdpjw" (UID: "f033a5d8-a3ec-47df-9593-01096396aeb5") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:44.515740 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:44.515239 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:44.515740 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:44.515370 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:44:45.515270 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:45.514781 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:45.515270 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:45.514784 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:45.515270 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:45.514901 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:44:45.515270 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:45.515080 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:44:46.061112 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:46.060975 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:46.061301 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:46.061170 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:46.061301 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:46.061200 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:46.061301 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:46.061213 2570 projected.go:194] Error preparing data for projected volume kube-api-access-vtcfh for pod openshift-network-diagnostics/network-check-target-x724s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:46.061301 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:46.061281 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh podName:b64a1006-8f55-41c1-9d77-457180e9a557 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.061262661 +0000 UTC m=+18.166914001 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vtcfh" (UniqueName: "kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh") pod "network-check-target-x724s" (UID: "b64a1006-8f55-41c1-9d77-457180e9a557") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:46.162175 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:46.162136 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:46.162376 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:46.162230 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:46.162448 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:46.162376 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:46.162448 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:46.162439 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret podName:f033a5d8-a3ec-47df-9593-01096396aeb5 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:50.162421734 +0000 UTC m=+14.268073097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret") pod "global-pull-secret-syncer-hdpjw" (UID: "f033a5d8-a3ec-47df-9593-01096396aeb5") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:46.162957 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:46.162914 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:46.163058 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:46.162998 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs podName:38dc6da4-4394-4935-80a5-6a872bf72125 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.162981207 +0000 UTC m=+18.268632548 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs") pod "network-metrics-daemon-ktpw5" (UID: "38dc6da4-4394-4935-80a5-6a872bf72125") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:46.515421 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:46.515338 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:46.515824 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:46.515466 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:44:47.515482 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:47.514986 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:47.515482 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:47.515100 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:44:47.515482 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:47.514987 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:47.515482 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:47.515444 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:44:48.515355 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:48.515316 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:48.515524 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:48.515460 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:44:49.514794 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:49.514756 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:49.515028 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:49.514891 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:44:49.515028 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:49.514941 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:49.515128 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:49.515043 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:44:50.193925 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:50.193874 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:50.194415 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:50.194021 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:50.194415 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:50.194106 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret podName:f033a5d8-a3ec-47df-9593-01096396aeb5 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:58.194083284 +0000 UTC m=+22.299734636 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret") pod "global-pull-secret-syncer-hdpjw" (UID: "f033a5d8-a3ec-47df-9593-01096396aeb5") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:50.514896 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:50.514808 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:50.515077 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:50.514965 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:44:51.514915 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:51.514881 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:51.514915 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:51.514923 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:51.515390 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:51.515030 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:44:51.515390 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:51.515184 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:44:52.514960 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:52.514911 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:44:52.515346 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:52.515063 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:44:53.515308 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:53.515269 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:44:53.515846 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:53.515269 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:44:53.515846 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:53.515395 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:44:53.515846 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:53.515492 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557" Mar 18 16:44:54.121681 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:54.121636 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:44:54.121865 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:54.121835 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:54.121904 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:54.121864 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:54.121904 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:54.121877 2570 projected.go:194] Error preparing data for projected volume kube-api-access-vtcfh for pod openshift-network-diagnostics/network-check-target-x724s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:54.121962 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:54.121937 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh podName:b64a1006-8f55-41c1-9d77-457180e9a557 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:10.121920091 +0000 UTC m=+34.227571450 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vtcfh" (UniqueName: "kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh") pod "network-check-target-x724s" (UID: "b64a1006-8f55-41c1-9d77-457180e9a557") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:54.222558 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:54.222511 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:54.222763 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:54.222641 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:54.222763 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:54.222731 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs podName:38dc6da4-4394-4935-80a5-6a872bf72125 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:10.222693084 +0000 UTC m=+34.328344440 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs") pod "network-metrics-daemon-ktpw5" (UID: "38dc6da4-4394-4935-80a5-6a872bf72125") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:54.514614 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:54.514540 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw" Mar 18 16:44:54.514768 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:54.514645 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5" Mar 18 16:44:55.514874 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:55.514839 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:55.515321 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:55.514841 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:44:55.515321 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:55.514990 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125" Mar 18 16:44:55.515321 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:55.515073 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557" Mar 18 16:44:56.516499 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:56.516457 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw" Mar 18 16:44:56.516892 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:56.516580 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5" Mar 18 16:44:57.514903 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.514450 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:57.515114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.514479 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:44:57.515114 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:57.515032 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125" Mar 18 16:44:57.515114 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:57.515085 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557" Mar 18 16:44:57.604060 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.604019 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" event={"ID":"1b7bffaa-80d9-47f1-9272-d89e7ac386cb","Type":"ContainerStarted","Data":"79fcbccf3b7dd2ae0813c0e3738854bd10358152b45d32d8eaa28cfce49dc185"} Mar 18 16:44:57.605454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.605429 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" event={"ID":"7b1009d6-de9a-4117-ae9e-b06a56a5138d","Type":"ContainerStarted","Data":"a7e798241f5bca7e8a9eaf4cf0c7a50713504d8147d2174d8918caafe8d8bad7"} Mar 18 16:44:57.606798 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.606767 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8mt8g" event={"ID":"aba28703-3193-48ac-bad1-170ac214d793","Type":"ContainerStarted","Data":"04405df003bd1838d08e5437629242e89c3df48ca56699a6aca9cb0ddca53d83"} Mar 18 16:44:57.608120 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.608094 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cnpd6" event={"ID":"f7a45c76-25de-47f6-8a92-fe9ea77a8a9c","Type":"ContainerStarted","Data":"3229a99b2438921e3398e989d9d1df3e3b5df264b41c54b55953832e52d1691e"} Mar 18 16:44:57.609401 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:44:57.609379 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-blj9x" event={"ID":"896404ec-1cea-4a36-9c02-cb5316bac310","Type":"ContainerStarted","Data":"49f8c6497646b2d1b72c38d456e9545b6b73090502a95fa247b85c39e49f1b22"} Mar 18 16:44:57.610613 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.610583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f4pdp" event={"ID":"4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a","Type":"ContainerStarted","Data":"236383031dcbf6d19770a35ed11b518d94022f6f9090738cc8f45e7657547a9e"} Mar 18 16:44:57.613028 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.613011 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 16:44:57.613281 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.613262 2570 generic.go:358] "Generic (PLEG): container finished" podID="935b66df-6c0c-487a-a4ff-9539cb02c34d" containerID="12618cccaab402a756e15f66f2cbbb92b256f54be196f089a429c395ddcbfe6c" exitCode=1 Mar 18 16:44:57.613338 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.613319 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerStarted","Data":"4300bbdedfa8161ea53e1cf030c4e06c843d6d5fbfb051b3332740b0af3012d5"} Mar 18 16:44:57.613375 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.613340 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerStarted","Data":"492480e7abb971878c7857462a088f2009ff02c73a394bfd3c34d6880af17661"} Mar 18 16:44:57.613375 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.613353 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerStarted","Data":"ffb136c8f30530b7e62006fed45639cb9d23fcce8c01280d587cdf1a40558d3b"} Mar 18 16:44:57.613375 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.613361 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerStarted","Data":"2828f07bce91144287e403c0145552a4d6350638fdce3d46e0e5d30f2ef6b30d"} Mar 18 16:44:57.613375 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.613369 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerDied","Data":"12618cccaab402a756e15f66f2cbbb92b256f54be196f089a429c395ddcbfe6c"} Mar 18 16:44:57.613540 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.613380 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerStarted","Data":"938600b3d6e324f22834cc7c613eff826b4c7685e45e7114c961f2ca0db4c63e"} Mar 18 16:44:57.614408 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.614389 2570 generic.go:358] "Generic (PLEG): container finished" podID="823ea246-d154-4a18-b04f-221eec27416d" containerID="a2172667abfbd3c49ea626e88e20164947402af51d10bb5e9a4e93c15a7038cb" exitCode=0 Mar 18 16:44:57.614487 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.614429 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" event={"ID":"823ea246-d154-4a18-b04f-221eec27416d","Type":"ContainerDied","Data":"a2172667abfbd3c49ea626e88e20164947402af51d10bb5e9a4e93c15a7038cb"} Mar 18 16:44:57.622569 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.622519 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-4dfx2" podStartSLOduration=4.179810046 podStartE2EDuration="21.622507353s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.180605521 +0000 UTC m=+3.286256876" lastFinishedPulling="2026-03-18 16:44:56.623302843 +0000 UTC m=+20.728954183" observedRunningTime="2026-03-18 16:44:57.621919291 +0000 UTC m=+21.727570664" watchObservedRunningTime="2026-03-18 16:44:57.622507353 +0000 UTC m=+21.728158714" Mar 18 16:44:57.652287 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.652231 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-blj9x" podStartSLOduration=4.235430541 podStartE2EDuration="21.652217811s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.182871743 +0000 UTC m=+3.288523084" lastFinishedPulling="2026-03-18 16:44:56.599659002 +0000 UTC m=+20.705310354" observedRunningTime="2026-03-18 16:44:57.651911965 +0000 UTC m=+21.757563324" watchObservedRunningTime="2026-03-18 16:44:57.652217811 +0000 UTC m=+21.757869171" Mar 18 16:44:57.652456 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.652298 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8mt8g" podStartSLOduration=4.230672954 podStartE2EDuration="21.652294508s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.178132826 +0000 UTC m=+3.283784180" lastFinishedPulling="2026-03-18 16:44:56.599754391 +0000 UTC m=+20.705405734" observedRunningTime="2026-03-18 16:44:57.636642595 +0000 UTC m=+21.742293955" watchObservedRunningTime="2026-03-18 16:44:57.652294508 +0000 UTC m=+21.757945868" Mar 18 16:44:57.672010 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.671958 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cnpd6" podStartSLOduration=4.1993627 
podStartE2EDuration="21.671945195s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.174669522 +0000 UTC m=+3.280320876" lastFinishedPulling="2026-03-18 16:44:56.647252016 +0000 UTC m=+20.752903371" observedRunningTime="2026-03-18 16:44:57.671391035 +0000 UTC m=+21.777042396" watchObservedRunningTime="2026-03-18 16:44:57.671945195 +0000 UTC m=+21.777596555" Mar 18 16:44:57.685819 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:57.685769 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-f4pdp" podStartSLOduration=9.125032291 podStartE2EDuration="21.685755228s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.181529248 +0000 UTC m=+3.287180587" lastFinishedPulling="2026-03-18 16:44:51.742252181 +0000 UTC m=+15.847903524" observedRunningTime="2026-03-18 16:44:57.685545141 +0000 UTC m=+21.791196503" watchObservedRunningTime="2026-03-18 16:44:57.685755228 +0000 UTC m=+21.791406588" Mar 18 16:44:58.255875 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.255839 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw" Mar 18 16:44:58.256083 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:58.256007 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:58.256146 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:58.256084 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret podName:f033a5d8-a3ec-47df-9593-01096396aeb5 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:45:14.256062935 +0000 UTC m=+38.361714278 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret") pod "global-pull-secret-syncer-hdpjw" (UID: "f033a5d8-a3ec-47df-9593-01096396aeb5") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:58.281762 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.281732 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 18 16:44:58.450559 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.450250 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:44:58.281751447Z","UUID":"6eb1b1d2-15c1-4c3e-b038-db3f3d9cb469","Handler":null,"Name":"","Endpoint":""} Mar 18 16:44:58.453340 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.453314 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 18 16:44:58.453340 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.453348 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 18 16:44:58.515202 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.515157 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw" Mar 18 16:44:58.515382 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:58.515310 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5" Mar 18 16:44:58.619126 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.619089 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" event={"ID":"7b1009d6-de9a-4117-ae9e-b06a56a5138d","Type":"ContainerStarted","Data":"bdde86402bd15cad64e3a42736d93a5314c650f79c03de91dde48042b7cd419f"} Mar 18 16:44:58.620505 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.620460 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9bgx9" event={"ID":"aa591bdc-c7e1-4131-b41c-dd5043afacbf","Type":"ContainerStarted","Data":"3cb249f03f213131d29cfb1d2a34523edbec4479c9a823c023bc4475cb73e147"} Mar 18 16:44:58.636980 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.636925 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9bgx9" podStartSLOduration=5.198106214 podStartE2EDuration="22.636910376s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.184433168 +0000 UTC m=+3.290084509" lastFinishedPulling="2026-03-18 16:44:56.62323733 +0000 UTC m=+20.728888671" observedRunningTime="2026-03-18 16:44:58.636459656 +0000 UTC m=+22.742111017" watchObservedRunningTime="2026-03-18 16:44:58.636910376 +0000 UTC m=+22.742561736" Mar 18 16:44:58.917357 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.917319 2570 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:58.921455 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:58.921251 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:59.515234 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:59.514725 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:44:59.515234 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:59.514746 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:44:59.515234 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:59.514900 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125" Mar 18 16:44:59.515234 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:44:59.515037 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557" Mar 18 16:44:59.626380 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:59.626124 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 16:44:59.626981 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:59.626927 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerStarted","Data":"85063fa6f74b879084251e4e84e86d79037bf34a4d55de6da6314219dbc9e1c8"} Mar 18 16:44:59.627382 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:59.627359 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:44:59.628013 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:44:59.627988 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-f4pdp" Mar 18 16:45:00.514737 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:00.514689 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw" Mar 18 16:45:00.514965 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:00.514831 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5" Mar 18 16:45:00.631590 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:00.631549 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" event={"ID":"7b1009d6-de9a-4117-ae9e-b06a56a5138d","Type":"ContainerStarted","Data":"b12c4462268b838491ee9ac5d596a1cb7188b18fcc291027d2c5adfadf539bbb"} Mar 18 16:45:00.651603 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:00.651550 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9h2v2" podStartSLOduration=4.117381683 podStartE2EDuration="24.651531993s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.179900072 +0000 UTC m=+3.285551429" lastFinishedPulling="2026-03-18 16:44:59.714050393 +0000 UTC m=+23.819701739" observedRunningTime="2026-03-18 16:45:00.651449384 +0000 UTC m=+24.757100745" watchObservedRunningTime="2026-03-18 16:45:00.651531993 +0000 UTC m=+24.757183354" Mar 18 16:45:01.515123 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:01.515083 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:45:01.515316 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:01.515083 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:45:01.515316 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:01.515225 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:45:01.515316 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:01.515265 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:45:02.515740 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:02.515484 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:45:02.516325 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:02.515776 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:45:02.637981 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:02.637949 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log"
Mar 18 16:45:02.638320 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:02.638296 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerStarted","Data":"dfb2fd7a84a7fc5f827170933d462ca6be25d43ff13d37a4ed677db969fcf0c6"}
Mar 18 16:45:02.638663 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:02.638632 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln"
Mar 18 16:45:02.638819 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:02.638800 2570 scope.go:117] "RemoveContainer" containerID="12618cccaab402a756e15f66f2cbbb92b256f54be196f089a429c395ddcbfe6c"
Mar 18 16:45:02.640327 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:02.640097 2570 generic.go:358] "Generic (PLEG): container finished" podID="823ea246-d154-4a18-b04f-221eec27416d" containerID="e5f13eddc667b89f74850dbc1227225b035bd72ede6f5b01dbb5a8e9bf87522e" exitCode=0
Mar 18 16:45:02.640327 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:02.640149 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" event={"ID":"823ea246-d154-4a18-b04f-221eec27416d","Type":"ContainerDied","Data":"e5f13eddc667b89f74850dbc1227225b035bd72ede6f5b01dbb5a8e9bf87522e"}
Mar 18 16:45:02.655425 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:02.655401 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln"
Mar 18 16:45:03.515205 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.515177 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:45:03.515330 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.515179 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:45:03.515330 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:03.515284 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:45:03.515412 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:03.515366 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:45:03.645290 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.645253 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log"
Mar 18 16:45:03.645664 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.645594 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" event={"ID":"935b66df-6c0c-487a-a4ff-9539cb02c34d","Type":"ContainerStarted","Data":"4a9238ffc03a7eac31928f70419ade50d091900aa4b7f442a3874c02bb9411c0"}
Mar 18 16:45:03.645811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.645795 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln"
Mar 18 16:45:03.645865 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.645817 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln"
Mar 18 16:45:03.660323 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.660290 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hdpjw"]
Mar 18 16:45:03.660476 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.660426 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:45:03.660554 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:03.660535 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:45:03.662234 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.662206 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ktpw5"]
Mar 18 16:45:03.662405 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.662310 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:45:03.662468 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:03.662401 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:45:03.662818 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.662794 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x724s"]
Mar 18 16:45:03.662926 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.662877 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:45:03.663000 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:03.662963 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:45:03.666191 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.666164 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln"
Mar 18 16:45:03.679616 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:03.679531 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" podStartSLOduration=10.160992462 podStartE2EDuration="27.679511553s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.177870384 +0000 UTC m=+3.283521738" lastFinishedPulling="2026-03-18 16:44:56.696389297 +0000 UTC m=+20.802040829" observedRunningTime="2026-03-18 16:45:03.679085275 +0000 UTC m=+27.784736636" watchObservedRunningTime="2026-03-18 16:45:03.679511553 +0000 UTC m=+27.785162917"
Mar 18 16:45:04.649087 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:04.648891 2570 generic.go:358] "Generic (PLEG): container finished" podID="823ea246-d154-4a18-b04f-221eec27416d" containerID="bf07fd7061613f634a4de7571877e8137fe95df7582fbdbad17a26a132a2af28" exitCode=0
Mar 18 16:45:04.649493 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:04.648974 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" event={"ID":"823ea246-d154-4a18-b04f-221eec27416d","Type":"ContainerDied","Data":"bf07fd7061613f634a4de7571877e8137fe95df7582fbdbad17a26a132a2af28"}
Mar 18 16:45:05.514810 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:05.514768 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:45:05.514810 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:05.514797 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:45:05.515069 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:05.514768 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:45:05.515069 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:05.514927 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:45:05.515069 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:05.514995 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:45:05.515069 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:05.515026 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:45:06.655472 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:06.655375 2570 generic.go:358] "Generic (PLEG): container finished" podID="823ea246-d154-4a18-b04f-221eec27416d" containerID="0be91b7201f573a5acf7f9bbf260efb40f685c4415b5f44e83dcfb7c149df3a6" exitCode=0
Mar 18 16:45:06.655472 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:06.655426 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" event={"ID":"823ea246-d154-4a18-b04f-221eec27416d","Type":"ContainerDied","Data":"0be91b7201f573a5acf7f9bbf260efb40f685c4415b5f44e83dcfb7c149df3a6"}
Mar 18 16:45:07.514465 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:07.514423 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:45:07.514630 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:07.514423 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:45:07.514630 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:07.514592 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:45:07.514770 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:07.514423 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:45:07.514770 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:07.514663 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:45:07.514770 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:07.514751 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:45:09.514903 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:09.514869 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:45:09.515538 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:09.515000 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:45:09.515538 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:09.515005 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hdpjw" podUID="f033a5d8-a3ec-47df-9593-01096396aeb5"
Mar 18 16:45:09.515538 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:09.515121 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125"
Mar 18 16:45:09.515538 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:09.515176 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:45:09.515538 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:09.515257 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x724s" podUID="b64a1006-8f55-41c1-9d77-457180e9a557"
Mar 18 16:45:10.145968 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.145927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:45:10.146276 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:10.146071 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:45:10.146276 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:10.146097 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:45:10.146276 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:10.146109 2570 projected.go:194] Error preparing data for projected volume kube-api-access-vtcfh for pod openshift-network-diagnostics/network-check-target-x724s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:45:10.146276 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:10.146176 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh podName:b64a1006-8f55-41c1-9d77-457180e9a557 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:42.146160364 +0000 UTC m=+66.251811705 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-vtcfh" (UniqueName: "kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh") pod "network-check-target-x724s" (UID: "b64a1006-8f55-41c1-9d77-457180e9a557") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:45:10.247177 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.247128 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:45:10.247373 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:10.247303 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:45:10.247373 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:10.247365 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs podName:38dc6da4-4394-4935-80a5-6a872bf72125 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:42.247345848 +0000 UTC m=+66.352997187 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs") pod "network-metrics-daemon-ktpw5" (UID: "38dc6da4-4394-4935-80a5-6a872bf72125") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:45:10.728612 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.728528 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-201.ec2.internal" event="NodeReady"
Mar 18 16:45:10.729132 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.728691 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 18 16:45:10.776149 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.776112 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t4psl"]
Mar 18 16:45:10.799447 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.799415 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4vc72"]
Mar 18 16:45:10.799637 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.799615 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:10.802609 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.802578 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Mar 18 16:45:10.802775 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.802593 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpjpl\""
Mar 18 16:45:10.802775 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.802593 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Mar 18 16:45:10.812781 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.812750 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t4psl"]
Mar 18 16:45:10.812781 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.812783 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4vc72"]
Mar 18 16:45:10.812977 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.812905 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4vc72"
Mar 18 16:45:10.816204 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.815977 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Mar 18 16:45:10.816204 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.815977 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Mar 18 16:45:10.816204 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.816104 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v9vx\""
Mar 18 16:45:10.816437 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.816348 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Mar 18 16:45:10.952803 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.952755 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72"
Mar 18 16:45:10.952990 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.952896 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdn9\" (UniqueName: \"kubernetes.io/projected/d386ca96-8632-46cd-b756-90a53fad9ef1-kube-api-access-tmdn9\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:10.952990 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.952951 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srfkj\" (UniqueName: \"kubernetes.io/projected/529555a4-f4da-4842-814f-1acffad52caf-kube-api-access-srfkj\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72"
Mar 18 16:45:10.952990 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.952980 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d386ca96-8632-46cd-b756-90a53fad9ef1-config-volume\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:10.953150 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.953008 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:10.953150 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:10.953026 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d386ca96-8632-46cd-b756-90a53fad9ef1-tmp-dir\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:11.054223 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.054179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdn9\" (UniqueName: \"kubernetes.io/projected/d386ca96-8632-46cd-b756-90a53fad9ef1-kube-api-access-tmdn9\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:11.054421 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.054250 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srfkj\" (UniqueName: \"kubernetes.io/projected/529555a4-f4da-4842-814f-1acffad52caf-kube-api-access-srfkj\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72"
Mar 18 16:45:11.054421 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.054276 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d386ca96-8632-46cd-b756-90a53fad9ef1-config-volume\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:11.054421 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.054306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:11.054421 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.054333 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d386ca96-8632-46cd-b756-90a53fad9ef1-tmp-dir\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:11.054421 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.054383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72"
Mar 18 16:45:11.054601 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:11.054496 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:11.054601 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:11.054496 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:11.054601 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:11.054551 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert podName:529555a4-f4da-4842-814f-1acffad52caf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:11.554532774 +0000 UTC m=+35.660184133 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert") pod "ingress-canary-4vc72" (UID: "529555a4-f4da-4842-814f-1acffad52caf") : secret "canary-serving-cert" not found
Mar 18 16:45:11.054601 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:11.054572 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls podName:d386ca96-8632-46cd-b756-90a53fad9ef1 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:11.554558254 +0000 UTC m=+35.660209593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls") pod "dns-default-t4psl" (UID: "d386ca96-8632-46cd-b756-90a53fad9ef1") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:11.054865 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.054845 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d386ca96-8632-46cd-b756-90a53fad9ef1-tmp-dir\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:11.066368 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.066337 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d386ca96-8632-46cd-b756-90a53fad9ef1-config-volume\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:11.068194 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.068164 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdn9\" (UniqueName: \"kubernetes.io/projected/d386ca96-8632-46cd-b756-90a53fad9ef1-kube-api-access-tmdn9\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:11.068330 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.068282 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srfkj\" (UniqueName: \"kubernetes.io/projected/529555a4-f4da-4842-814f-1acffad52caf-kube-api-access-srfkj\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72"
Mar 18 16:45:11.515387 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.515298 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s"
Mar 18 16:45:11.515575 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.515321 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:45:11.515575 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.515440 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5"
Mar 18 16:45:11.518244 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.518215 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Mar 18 16:45:11.519666 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.519646 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 18 16:45:11.519801 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.519686 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 18 16:45:11.519801 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.519782 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-97tcn\""
Mar 18 16:45:11.519907 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.519881 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 18 16:45:11.519959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.519915 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wwjb5\""
Mar 18 16:45:11.558925 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.558888 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:11.559124 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:11.558955 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72"
Mar 18 16:45:11.559124 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:11.559059 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:11.559124 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:11.559113 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:11.559282 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:11.559142 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls podName:d386ca96-8632-46cd-b756-90a53fad9ef1 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:12.559122509 +0000 UTC m=+36.664773848 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls") pod "dns-default-t4psl" (UID: "d386ca96-8632-46cd-b756-90a53fad9ef1") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:11.559282 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:11.559172 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert podName:529555a4-f4da-4842-814f-1acffad52caf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:12.5591559 +0000 UTC m=+36.664807244 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert") pod "ingress-canary-4vc72" (UID: "529555a4-f4da-4842-814f-1acffad52caf") : secret "canary-serving-cert" not found
Mar 18 16:45:12.569369 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:12.569186 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl"
Mar 18 16:45:12.569369 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:12.569350 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:12.569843 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:12.569407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72"
Mar 18 16:45:12.569843 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:12.569421 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls podName:d386ca96-8632-46cd-b756-90a53fad9ef1 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:14.5694006 +0000 UTC m=+38.675051941 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls") pod "dns-default-t4psl" (UID: "d386ca96-8632-46cd-b756-90a53fad9ef1") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:12.569843 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:12.569495 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:12.569843 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:12.569552 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert podName:529555a4-f4da-4842-814f-1acffad52caf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:14.569537697 +0000 UTC m=+38.675189039 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert") pod "ingress-canary-4vc72" (UID: "529555a4-f4da-4842-814f-1acffad52caf") : secret "canary-serving-cert" not found
Mar 18 16:45:12.670520 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:12.670437 2570 generic.go:358] "Generic (PLEG): container finished" podID="823ea246-d154-4a18-b04f-221eec27416d" containerID="fb60c5e3bc86679d2b59de0dc0ec3f10b3be314e091e075fc9115d7271db786a" exitCode=0
Mar 18 16:45:12.670520 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:12.670482 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" event={"ID":"823ea246-d154-4a18-b04f-221eec27416d","Type":"ContainerDied","Data":"fb60c5e3bc86679d2b59de0dc0ec3f10b3be314e091e075fc9115d7271db786a"}
Mar 18 16:45:13.675299 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:13.675262 2570 generic.go:358] "Generic (PLEG): container finished" podID="823ea246-d154-4a18-b04f-221eec27416d" containerID="e96a985bfc308f133ca10ed08a392ef0f16b978597477f3c3342c3e2163fd850" exitCode=0
Mar 18 16:45:13.675653 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:13.675307 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" event={"ID":"823ea246-d154-4a18-b04f-221eec27416d","Type":"ContainerDied","Data":"e96a985bfc308f133ca10ed08a392ef0f16b978597477f3c3342c3e2163fd850"}
Mar 18 16:45:14.283625 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:14.283571 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:45:14.287167 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:14.287140 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f033a5d8-a3ec-47df-9593-01096396aeb5-original-pull-secret\") pod \"global-pull-secret-syncer-hdpjw\" (UID: \"f033a5d8-a3ec-47df-9593-01096396aeb5\") " pod="kube-system/global-pull-secret-syncer-hdpjw"
Mar 18 16:45:14.536391 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:14.536271 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hdpjw" Mar 18 16:45:14.586440 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:14.586401 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl" Mar 18 16:45:14.586627 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:14.586453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72" Mar 18 16:45:14.586627 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:14.586585 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:14.586627 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:14.586608 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:14.586873 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:14.586637 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert podName:529555a4-f4da-4842-814f-1acffad52caf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:18.58662386 +0000 UTC m=+42.692275199 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert") pod "ingress-canary-4vc72" (UID: "529555a4-f4da-4842-814f-1acffad52caf") : secret "canary-serving-cert" not found Mar 18 16:45:14.586873 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:14.586680 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls podName:d386ca96-8632-46cd-b756-90a53fad9ef1 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:18.586657395 +0000 UTC m=+42.692308735 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls") pod "dns-default-t4psl" (UID: "d386ca96-8632-46cd-b756-90a53fad9ef1") : secret "dns-default-metrics-tls" not found Mar 18 16:45:14.679013 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:14.678959 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hdpjw"] Mar 18 16:45:14.681258 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:14.681231 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" event={"ID":"823ea246-d154-4a18-b04f-221eec27416d","Type":"ContainerStarted","Data":"d24ba034730a34baf017a8c7aab69215e4215e0b5ce6702d01717e58cb65a06b"} Mar 18 16:45:14.683030 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:45:14.682991 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf033a5d8_a3ec_47df_9593_01096396aeb5.slice/crio-6e05b492ca7d53f123816ccc7cc75cf26107be0bea247275f4b7427c55b3a064 WatchSource:0}: Error finding container 6e05b492ca7d53f123816ccc7cc75cf26107be0bea247275f4b7427c55b3a064: Status 404 returned error can't find the container with id 6e05b492ca7d53f123816ccc7cc75cf26107be0bea247275f4b7427c55b3a064 Mar 18 
16:45:14.703935 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:14.703884 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ptdgh" podStartSLOduration=5.741336201 podStartE2EDuration="38.703870776s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.171896867 +0000 UTC m=+3.277548209" lastFinishedPulling="2026-03-18 16:45:12.134431442 +0000 UTC m=+36.240082784" observedRunningTime="2026-03-18 16:45:14.702328033 +0000 UTC m=+38.807979395" watchObservedRunningTime="2026-03-18 16:45:14.703870776 +0000 UTC m=+38.809522137" Mar 18 16:45:15.684994 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:15.684951 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hdpjw" event={"ID":"f033a5d8-a3ec-47df-9593-01096396aeb5","Type":"ContainerStarted","Data":"6e05b492ca7d53f123816ccc7cc75cf26107be0bea247275f4b7427c55b3a064"} Mar 18 16:45:18.618852 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:18.618788 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl" Mar 18 16:45:18.619272 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:18.618870 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72" Mar 18 16:45:18.619272 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:18.618931 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:18.619272 ip-10-0-129-201 
kubenswrapper[2570]: E0318 16:45:18.619002 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls podName:d386ca96-8632-46cd-b756-90a53fad9ef1 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:26.618978911 +0000 UTC m=+50.724630255 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls") pod "dns-default-t4psl" (UID: "d386ca96-8632-46cd-b756-90a53fad9ef1") : secret "dns-default-metrics-tls" not found Mar 18 16:45:18.619272 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:18.619013 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:18.619272 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:18.619063 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert podName:529555a4-f4da-4842-814f-1acffad52caf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:26.619049262 +0000 UTC m=+50.724700602 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert") pod "ingress-canary-4vc72" (UID: "529555a4-f4da-4842-814f-1acffad52caf") : secret "canary-serving-cert" not found Mar 18 16:45:18.692154 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:18.692112 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hdpjw" event={"ID":"f033a5d8-a3ec-47df-9593-01096396aeb5","Type":"ContainerStarted","Data":"e8ec6f5dc869e2f8b2ab5a0cd2ae342c0688361f78569122f5daefb2a20b9505"} Mar 18 16:45:18.707244 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:18.707178 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hdpjw" podStartSLOduration=33.210372518 podStartE2EDuration="36.707155125s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:45:14.684647681 +0000 UTC m=+38.790299023" lastFinishedPulling="2026-03-18 16:45:18.181430278 +0000 UTC m=+42.287081630" observedRunningTime="2026-03-18 16:45:18.706771424 +0000 UTC m=+42.812422783" watchObservedRunningTime="2026-03-18 16:45:18.707155125 +0000 UTC m=+42.812806488" Mar 18 16:45:26.676771 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:26.676723 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl" Mar 18 16:45:26.677255 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:26.676787 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72" Mar 18 
16:45:26.677255 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:26.676833 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:26.677255 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:26.676884 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:26.677255 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:26.676902 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls podName:d386ca96-8632-46cd-b756-90a53fad9ef1 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:42.676884377 +0000 UTC m=+66.782535716 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls") pod "dns-default-t4psl" (UID: "d386ca96-8632-46cd-b756-90a53fad9ef1") : secret "dns-default-metrics-tls" not found Mar 18 16:45:26.677255 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:26.676924 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert podName:529555a4-f4da-4842-814f-1acffad52caf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:42.67691072 +0000 UTC m=+66.782562059 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert") pod "ingress-canary-4vc72" (UID: "529555a4-f4da-4842-814f-1acffad52caf") : secret "canary-serving-cert" not found Mar 18 16:45:35.662235 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:35.662207 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gk7ln" Mar 18 16:45:42.182389 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.182344 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:45:42.185393 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.185371 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:45:42.195686 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.195656 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:45:42.206297 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.206267 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtcfh\" (UniqueName: \"kubernetes.io/projected/b64a1006-8f55-41c1-9d77-457180e9a557-kube-api-access-vtcfh\") pod \"network-check-target-x724s\" (UID: \"b64a1006-8f55-41c1-9d77-457180e9a557\") " pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:45:42.283468 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.283421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:45:42.286248 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.286230 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:45:42.294366 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:42.294341 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:45:42.294434 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:42.294420 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs podName:38dc6da4-4394-4935-80a5-6a872bf72125 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:46.29440224 +0000 UTC m=+130.400053579 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs") pod "network-metrics-daemon-ktpw5" (UID: "38dc6da4-4394-4935-80a5-6a872bf72125") : secret "metrics-daemon-secret" not found Mar 18 16:45:42.429678 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.429645 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wwjb5\"" Mar 18 16:45:42.437774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.437688 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:45:42.555865 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.555832 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x724s"] Mar 18 16:45:42.559570 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:45:42.559537 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb64a1006_8f55_41c1_9d77_457180e9a557.slice/crio-3ea380dd19dabd4be3c95adf171dfa02329b35919536fb95ba91ae89289ecccb WatchSource:0}: Error finding container 3ea380dd19dabd4be3c95adf171dfa02329b35919536fb95ba91ae89289ecccb: Status 404 returned error can't find the container with id 3ea380dd19dabd4be3c95adf171dfa02329b35919536fb95ba91ae89289ecccb Mar 18 16:45:42.685895 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.685851 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl" Mar 18 16:45:42.685895 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.685902 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72" Mar 18 16:45:42.686125 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:42.685993 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:42.686125 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:42.685997 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Mar 18 16:45:42.686125 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:42.686038 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert podName:529555a4-f4da-4842-814f-1acffad52caf nodeName:}" failed. No retries permitted until 2026-03-18 16:46:14.686024246 +0000 UTC m=+98.791675585 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert") pod "ingress-canary-4vc72" (UID: "529555a4-f4da-4842-814f-1acffad52caf") : secret "canary-serving-cert" not found Mar 18 16:45:42.686125 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:45:42.686095 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls podName:d386ca96-8632-46cd-b756-90a53fad9ef1 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:14.686083015 +0000 UTC m=+98.791734354 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls") pod "dns-default-t4psl" (UID: "d386ca96-8632-46cd-b756-90a53fad9ef1") : secret "dns-default-metrics-tls" not found Mar 18 16:45:42.740317 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:42.740220 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x724s" event={"ID":"b64a1006-8f55-41c1-9d77-457180e9a557","Type":"ContainerStarted","Data":"3ea380dd19dabd4be3c95adf171dfa02329b35919536fb95ba91ae89289ecccb"} Mar 18 16:45:45.747677 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:45.747634 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x724s" event={"ID":"b64a1006-8f55-41c1-9d77-457180e9a557","Type":"ContainerStarted","Data":"469bf85ab00339bb0f9a92c50a5c96ec19566c1b83326bd3e1a86954929e45b2"} Mar 18 16:45:45.748058 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:45.747765 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:45:45.763688 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:45:45.763607 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-x724s" podStartSLOduration=67.201354181 podStartE2EDuration="1m9.763591051s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:45:42.561585693 +0000 UTC m=+66.667237032" lastFinishedPulling="2026-03-18 16:45:45.123822562 +0000 UTC m=+69.229473902" observedRunningTime="2026-03-18 16:45:45.763478112 +0000 UTC m=+69.869129487" watchObservedRunningTime="2026-03-18 16:45:45.763591051 +0000 UTC m=+69.869242412" Mar 18 16:46:14.706344 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:14.706280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl" Mar 18 16:46:14.706344 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:14.706349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72" Mar 18 16:46:14.706834 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:14.706454 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:46:14.706834 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:14.706454 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:46:14.706834 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:14.706527 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert podName:529555a4-f4da-4842-814f-1acffad52caf nodeName:}" failed. No retries permitted until 2026-03-18 16:47:18.706504442 +0000 UTC m=+162.812155781 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert") pod "ingress-canary-4vc72" (UID: "529555a4-f4da-4842-814f-1acffad52caf") : secret "canary-serving-cert" not found Mar 18 16:46:14.706834 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:14.706542 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls podName:d386ca96-8632-46cd-b756-90a53fad9ef1 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:47:18.706536387 +0000 UTC m=+162.812187727 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls") pod "dns-default-t4psl" (UID: "d386ca96-8632-46cd-b756-90a53fad9ef1") : secret "dns-default-metrics-tls" not found Mar 18 16:46:16.751981 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:16.751946 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x724s" Mar 18 16:46:36.067170 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.067133 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-p44bj"] Mar 18 16:46:36.069907 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.069889 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.072828 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.072794 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-mcps4\"" Mar 18 16:46:36.073214 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.073195 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Mar 18 16:46:36.073536 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.073258 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Mar 18 16:46:36.074199 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.074174 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Mar 18 16:46:36.074350 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.074339 2570 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Mar 18 16:46:36.077765 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.077728 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Mar 18 16:46:36.078940 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.078910 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-p44bj"] Mar 18 16:46:36.153558 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.153511 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b81d0e21-b085-449c-848a-8150e032f670-service-ca-bundle\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.153558 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.153554 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b81d0e21-b085-449c-848a-8150e032f670-tmp\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.153864 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.153596 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dthjx\" (UniqueName: \"kubernetes.io/projected/b81d0e21-b085-449c-848a-8150e032f670-kube-api-access-dthjx\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.153864 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.153649 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b81d0e21-b085-449c-848a-8150e032f670-snapshots\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.153864 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.153665 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81d0e21-b085-449c-848a-8150e032f670-serving-cert\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.153864 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.153768 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b81d0e21-b085-449c-848a-8150e032f670-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.172378 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.172336 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7bd4c46cc4-vslwk"] Mar 18 16:46:36.175197 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.175177 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-76b8565867-gjg6w"] Mar 18 16:46:36.175344 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.175327 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.177880 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.177852 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Mar 18 16:46:36.178162 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.178146 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sf568\"" Mar 18 16:46:36.178271 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.178195 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.178271 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.178214 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Mar 18 16:46:36.178651 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.178633 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Mar 18 16:46:36.178790 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.178773 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Mar 18 16:46:36.178843 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.178818 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Mar 18 16:46:36.178895 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.178880 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Mar 18 16:46:36.180621 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.180601 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Mar 18 
16:46:36.180745 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.180636 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:36.180815 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.180790 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-88662\"" Mar 18 16:46:36.181189 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.181170 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Mar 18 16:46:36.181397 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.181379 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Mar 18 16:46:36.189682 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.189648 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Mar 18 16:46:36.190864 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.190839 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-gjg6w"] Mar 18 16:46:36.191596 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.191579 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bd4c46cc4-vslwk"] Mar 18 16:46:36.254554 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254516 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b81d0e21-b085-449c-848a-8150e032f670-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.254554 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:46:36.254560 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-stats-auth\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.254843 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254578 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b81d0e21-b085-449c-848a-8150e032f670-service-ca-bundle\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.254843 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254596 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.254843 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254611 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c78a76c-298c-468a-a6bd-98bc2950f67a-serving-cert\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.254843 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b81d0e21-b085-449c-848a-8150e032f670-tmp\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: 
\"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.254843 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254788 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c78a76c-298c-468a-a6bd-98bc2950f67a-config\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.254843 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254842 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dthjx\" (UniqueName: \"kubernetes.io/projected/b81d0e21-b085-449c-848a-8150e032f670-kube-api-access-dthjx\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.255125 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lmpp\" (UniqueName: \"kubernetes.io/projected/7b285194-6029-4441-b4e2-56fdcc973573-kube-api-access-9lmpp\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.255125 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254902 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-default-certificate\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.255125 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.254927 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh57z\" (UniqueName: \"kubernetes.io/projected/0c78a76c-298c-468a-a6bd-98bc2950f67a-kube-api-access-hh57z\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.255125 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.255072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b81d0e21-b085-449c-848a-8150e032f670-snapshots\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.255125 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.255095 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81d0e21-b085-449c-848a-8150e032f670-serving-cert\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.255125 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.255114 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c78a76c-298c-468a-a6bd-98bc2950f67a-trusted-ca\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.255423 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.255146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle\") pod 
\"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.255423 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.255170 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b81d0e21-b085-449c-848a-8150e032f670-tmp\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.255423 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.255328 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b81d0e21-b085-449c-848a-8150e032f670-service-ca-bundle\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.255568 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.255498 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b81d0e21-b085-449c-848a-8150e032f670-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.255612 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.255579 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b81d0e21-b085-449c-848a-8150e032f670-snapshots\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.257612 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.257589 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b81d0e21-b085-449c-848a-8150e032f670-serving-cert\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.263036 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.263003 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dthjx\" (UniqueName: \"kubernetes.io/projected/b81d0e21-b085-449c-848a-8150e032f670-kube-api-access-dthjx\") pod \"insights-operator-76bdd9f478-p44bj\" (UID: \"b81d0e21-b085-449c-848a-8150e032f670\") " pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.356510 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.356411 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.356510 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.356481 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-stats-auth\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.356510 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.356507 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.356841 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:46:36.356529 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c78a76c-298c-468a-a6bd-98bc2950f67a-serving-cert\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.356841 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.356563 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c78a76c-298c-468a-a6bd-98bc2950f67a-config\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.356841 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.356607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lmpp\" (UniqueName: \"kubernetes.io/projected/7b285194-6029-4441-b4e2-56fdcc973573-kube-api-access-9lmpp\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.356841 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.356636 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-default-certificate\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.356841 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.356658 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh57z\" (UniqueName: \"kubernetes.io/projected/0c78a76c-298c-468a-a6bd-98bc2950f67a-kube-api-access-hh57z\") pod \"console-operator-76b8565867-gjg6w\" (UID: 
\"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.356841 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.356720 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c78a76c-298c-468a-a6bd-98bc2950f67a-trusted-ca\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.359245 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.359214 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Mar 18 16:46:36.359245 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.359232 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Mar 18 16:46:36.359245 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.359241 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Mar 18 16:46:36.359245 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.359249 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Mar 18 16:46:36.359490 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.359231 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Mar 18 16:46:36.359490 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.359214 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Mar 18 16:46:36.365399 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.365049 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Mar 18 16:46:36.365399 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.365238 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Mar 18 16:46:36.365399 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.365384 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Mar 18 16:46:36.366894 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:36.366865 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:36.366894 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:36.366947 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:36.866922637 +0000 UTC m=+120.972573976 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:36.367198 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:36.367003 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:36.866975159 +0000 UTC m=+120.972626498 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : secret "router-metrics-certs-default" not found Mar 18 16:46:36.368118 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.368090 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c78a76c-298c-468a-a6bd-98bc2950f67a-config\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.368594 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.368570 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c78a76c-298c-468a-a6bd-98bc2950f67a-trusted-ca\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.369555 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.369531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-default-certificate\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.369555 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.369551 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-stats-auth\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.370025 ip-10-0-129-201 kubenswrapper[2570]: 
I0318 16:46:36.370009 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c78a76c-298c-468a-a6bd-98bc2950f67a-serving-cert\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.375860 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.375824 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:36.375860 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.375827 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Mar 18 16:46:36.384597 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.384572 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-mcps4\"" Mar 18 16:46:36.387131 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.387105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lmpp\" (UniqueName: \"kubernetes.io/projected/7b285194-6029-4441-b4e2-56fdcc973573-kube-api-access-9lmpp\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.387248 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.387105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh57z\" (UniqueName: \"kubernetes.io/projected/0c78a76c-298c-468a-a6bd-98bc2950f67a-kube-api-access-hh57z\") pod \"console-operator-76b8565867-gjg6w\" (UID: \"0c78a76c-298c-468a-a6bd-98bc2950f67a\") " pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.392978 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.392954 2570 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-p44bj" Mar 18 16:46:36.495093 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.495048 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-88662\"" Mar 18 16:46:36.503276 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.503241 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:36.513590 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.513558 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-p44bj"] Mar 18 16:46:36.517457 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:46:36.517427 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb81d0e21_b085_449c_848a_8150e032f670.slice/crio-fa2022f62f09a974c817d46aad8b725d844267c94d16468f94646b797b004c22 WatchSource:0}: Error finding container fa2022f62f09a974c817d46aad8b725d844267c94d16468f94646b797b004c22: Status 404 returned error can't find the container with id fa2022f62f09a974c817d46aad8b725d844267c94d16468f94646b797b004c22 Mar 18 16:46:36.630540 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.630443 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-gjg6w"] Mar 18 16:46:36.636508 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:46:36.636475 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c78a76c_298c_468a_a6bd_98bc2950f67a.slice/crio-3ff8cbfd50f58d81c7d7c2d9c4f4120c91efdc4008c21ae2080756ea846cec66 WatchSource:0}: Error finding container 3ff8cbfd50f58d81c7d7c2d9c4f4120c91efdc4008c21ae2080756ea846cec66: Status 404 returned error can't find the container 
with id 3ff8cbfd50f58d81c7d7c2d9c4f4120c91efdc4008c21ae2080756ea846cec66 Mar 18 16:46:36.851616 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.851584 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" event={"ID":"0c78a76c-298c-468a-a6bd-98bc2950f67a","Type":"ContainerStarted","Data":"3ff8cbfd50f58d81c7d7c2d9c4f4120c91efdc4008c21ae2080756ea846cec66"} Mar 18 16:46:36.852563 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.852539 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-p44bj" event={"ID":"b81d0e21-b085-449c-848a-8150e032f670","Type":"ContainerStarted","Data":"fa2022f62f09a974c817d46aad8b725d844267c94d16468f94646b797b004c22"} Mar 18 16:46:36.960554 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.960464 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.960554 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:36.960521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:36.960804 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:36.960632 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:36.960804 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:36.960662 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:37.960638435 +0000 UTC m=+122.066289789 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:36.960804 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:36.960730 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:37.960717421 +0000 UTC m=+122.066368764 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : secret "router-metrics-certs-default" not found Mar 18 16:46:37.968958 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:37.968911 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:37.969435 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:37.968984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: 
\"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:37.969435 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:37.969087 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:37.969435 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:37.969102 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:39.969079012 +0000 UTC m=+124.074730363 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:37.969435 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:37.969139 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:39.969121355 +0000 UTC m=+124.074772695 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : secret "router-metrics-certs-default" not found Mar 18 16:46:39.859820 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:39.859787 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-p44bj" event={"ID":"b81d0e21-b085-449c-848a-8150e032f670","Type":"ContainerStarted","Data":"2c59a468fb460012b45c5114ba0000f5f07394e2d24d06f4d884e34a0abe9da2"} Mar 18 16:46:39.861380 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:39.861353 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/0.log" Mar 18 16:46:39.861498 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:39.861403 2570 generic.go:358] "Generic (PLEG): container finished" podID="0c78a76c-298c-468a-a6bd-98bc2950f67a" containerID="7a5bea8690523713ac1d2d533364f0b790091a58fe7ae033ef75cc63cf9ebe86" exitCode=255 Mar 18 16:46:39.861498 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:39.861441 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" event={"ID":"0c78a76c-298c-468a-a6bd-98bc2950f67a","Type":"ContainerDied","Data":"7a5bea8690523713ac1d2d533364f0b790091a58fe7ae033ef75cc63cf9ebe86"} Mar 18 16:46:39.861672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:39.861654 2570 scope.go:117] "RemoveContainer" containerID="7a5bea8690523713ac1d2d533364f0b790091a58fe7ae033ef75cc63cf9ebe86" Mar 18 16:46:39.877255 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:39.877201 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-76bdd9f478-p44bj" podStartSLOduration=1.56244005 
podStartE2EDuration="3.877182797s" podCreationTimestamp="2026-03-18 16:46:36 +0000 UTC" firstStartedPulling="2026-03-18 16:46:36.519332822 +0000 UTC m=+120.624984165" lastFinishedPulling="2026-03-18 16:46:38.834075569 +0000 UTC m=+122.939726912" observedRunningTime="2026-03-18 16:46:39.876418696 +0000 UTC m=+123.982070058" watchObservedRunningTime="2026-03-18 16:46:39.877182797 +0000 UTC m=+123.982834159" Mar 18 16:46:39.983399 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:39.983359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:39.983578 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:39.983465 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:39.983642 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:39.983573 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:43.983547599 +0000 UTC m=+128.089198954 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:39.984057 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:39.983867 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:39.984057 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:39.983920 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:43.983904044 +0000 UTC m=+128.089555386 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : secret "router-metrics-certs-default" not found Mar 18 16:46:40.865986 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:40.865955 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/1.log" Mar 18 16:46:40.866376 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:40.866343 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/0.log" Mar 18 16:46:40.866426 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:40.866375 2570 generic.go:358] "Generic (PLEG): container finished" podID="0c78a76c-298c-468a-a6bd-98bc2950f67a" containerID="1247fb894725604dd60174383e23e373e3a4b7ec7976680bbf5924695ee0f96c" 
exitCode=255 Mar 18 16:46:40.866489 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:40.866466 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" event={"ID":"0c78a76c-298c-468a-a6bd-98bc2950f67a","Type":"ContainerDied","Data":"1247fb894725604dd60174383e23e373e3a4b7ec7976680bbf5924695ee0f96c"} Mar 18 16:46:40.866548 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:40.866510 2570 scope.go:117] "RemoveContainer" containerID="7a5bea8690523713ac1d2d533364f0b790091a58fe7ae033ef75cc63cf9ebe86" Mar 18 16:46:40.866779 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:40.866757 2570 scope.go:117] "RemoveContainer" containerID="1247fb894725604dd60174383e23e373e3a4b7ec7976680bbf5924695ee0f96c" Mar 18 16:46:40.866971 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:40.866954 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-gjg6w_openshift-console-operator(0c78a76c-298c-468a-a6bd-98bc2950f67a)\"" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" podUID="0c78a76c-298c-468a-a6bd-98bc2950f67a" Mar 18 16:46:41.869322 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:41.869293 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/1.log" Mar 18 16:46:41.869681 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:41.869638 2570 scope.go:117] "RemoveContainer" containerID="1247fb894725604dd60174383e23e373e3a4b7ec7976680bbf5924695ee0f96c" Mar 18 16:46:41.869862 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:41.869844 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=console-operator pod=console-operator-76b8565867-gjg6w_openshift-console-operator(0c78a76c-298c-468a-a6bd-98bc2950f67a)\"" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" podUID="0c78a76c-298c-468a-a6bd-98bc2950f67a" Mar 18 16:46:43.008928 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:43.008897 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-blj9x_896404ec-1cea-4a36-9c02-cb5316bac310/dns-node-resolver/0.log" Mar 18 16:46:43.809312 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:43.809285 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8mt8g_aba28703-3193-48ac-bad1-170ac214d793/node-ca/0.log" Mar 18 16:46:44.017107 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.017062 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:44.017589 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.017130 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:44.017589 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:44.017229 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:44.017589 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:44.017259 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:52.017241096 +0000 UTC m=+136.122892436 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:44.017589 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:44.017297 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:52.017284353 +0000 UTC m=+136.122935699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : secret "router-metrics-certs-default" not found Mar 18 16:46:44.436009 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.435973 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-4z48v"] Mar 18 16:46:44.439928 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.439909 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.442359 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.442333 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Mar 18 16:46:44.442565 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.442548 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-g2q4n\"" Mar 18 16:46:44.443634 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.443607 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Mar 18 16:46:44.443765 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.443651 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Mar 18 16:46:44.443765 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.443613 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Mar 18 16:46:44.446812 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.446766 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-4z48v"] Mar 18 16:46:44.521435 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.521397 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d-signing-key\") pod \"service-ca-8bb587b94-4z48v\" (UID: \"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d\") " pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.521621 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.521450 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d-signing-cabundle\") pod \"service-ca-8bb587b94-4z48v\" (UID: \"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d\") " pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.521621 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.521473 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42tt\" (UniqueName: \"kubernetes.io/projected/266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d-kube-api-access-k42tt\") pod \"service-ca-8bb587b94-4z48v\" (UID: \"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d\") " pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.622309 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.622266 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d-signing-key\") pod \"service-ca-8bb587b94-4z48v\" (UID: \"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d\") " pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.622478 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.622322 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d-signing-cabundle\") pod \"service-ca-8bb587b94-4z48v\" (UID: \"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d\") " pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.622478 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.622343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k42tt\" (UniqueName: \"kubernetes.io/projected/266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d-kube-api-access-k42tt\") pod \"service-ca-8bb587b94-4z48v\" (UID: \"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d\") " pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.623005 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:46:44.622975 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d-signing-cabundle\") pod \"service-ca-8bb587b94-4z48v\" (UID: \"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d\") " pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.624934 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.624913 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d-signing-key\") pod \"service-ca-8bb587b94-4z48v\" (UID: \"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d\") " pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.631173 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.631143 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42tt\" (UniqueName: \"kubernetes.io/projected/266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d-kube-api-access-k42tt\") pod \"service-ca-8bb587b94-4z48v\" (UID: \"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d\") " pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.748984 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.748888 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-4z48v" Mar 18 16:46:44.875368 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:44.875335 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-4z48v"] Mar 18 16:46:44.878820 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:46:44.878787 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod266f28c6_3af2_4cbc_9ae3_a88c0e9a7e6d.slice/crio-25b70ebf224c0ab6e6e73523e351eaa813b2a60cc4823c6a5df221ae31dec202 WatchSource:0}: Error finding container 25b70ebf224c0ab6e6e73523e351eaa813b2a60cc4823c6a5df221ae31dec202: Status 404 returned error can't find the container with id 25b70ebf224c0ab6e6e73523e351eaa813b2a60cc4823c6a5df221ae31dec202 Mar 18 16:46:45.878958 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:45.878917 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-4z48v" event={"ID":"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d","Type":"ContainerStarted","Data":"25b70ebf224c0ab6e6e73523e351eaa813b2a60cc4823c6a5df221ae31dec202"} Mar 18 16:46:46.338370 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:46.338330 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:46:46.338574 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:46.338516 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:46:46.338625 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:46.338602 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs podName:38dc6da4-4394-4935-80a5-6a872bf72125 nodeName:}" failed. No retries permitted until 2026-03-18 16:48:48.338580083 +0000 UTC m=+252.444231430 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs") pod "network-metrics-daemon-ktpw5" (UID: "38dc6da4-4394-4935-80a5-6a872bf72125") : secret "metrics-daemon-secret" not found Mar 18 16:46:46.504060 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:46.504019 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:46.504060 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:46.504073 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:46:46.504567 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:46.504552 2570 scope.go:117] "RemoveContainer" containerID="1247fb894725604dd60174383e23e373e3a4b7ec7976680bbf5924695ee0f96c" Mar 18 16:46:46.504809 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:46.504786 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-gjg6w_openshift-console-operator(0c78a76c-298c-468a-a6bd-98bc2950f67a)\"" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" podUID="0c78a76c-298c-468a-a6bd-98bc2950f67a" Mar 18 16:46:47.887016 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:47.886976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-4z48v" 
event={"ID":"266f28c6-3af2-4cbc-9ae3-a88c0e9a7e6d","Type":"ContainerStarted","Data":"8c0d561f4754e43aa6647076e1473134fd93cfa730e5687c53737e9624b61639"} Mar 18 16:46:47.904063 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:47.904010 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-8bb587b94-4z48v" podStartSLOduration=1.976392752 podStartE2EDuration="3.90399285s" podCreationTimestamp="2026-03-18 16:46:44 +0000 UTC" firstStartedPulling="2026-03-18 16:46:44.880839796 +0000 UTC m=+128.986491136" lastFinishedPulling="2026-03-18 16:46:46.808439886 +0000 UTC m=+130.914091234" observedRunningTime="2026-03-18 16:46:47.903407197 +0000 UTC m=+132.009058558" watchObservedRunningTime="2026-03-18 16:46:47.90399285 +0000 UTC m=+132.009644213" Mar 18 16:46:52.086262 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:52.086215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:52.086683 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:46:52.086277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:46:52.086683 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:52.086389 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:52.086683 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:52.086397 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:08.086378152 +0000 UTC m=+152.192029501 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:52.086683 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:46:52.086431 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs podName:7b285194-6029-4441-b4e2-56fdcc973573 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:08.086419885 +0000 UTC m=+152.192071225 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs") pod "router-default-7bd4c46cc4-vslwk" (UID: "7b285194-6029-4441-b4e2-56fdcc973573") : secret "router-metrics-certs-default" not found Mar 18 16:47:00.515424 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:00.515389 2570 scope.go:117] "RemoveContainer" containerID="1247fb894725604dd60174383e23e373e3a4b7ec7976680bbf5924695ee0f96c" Mar 18 16:47:00.919052 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:00.919009 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:47:00.919414 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:00.919398 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/1.log" Mar 18 
16:47:00.919466 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:00.919432 2570 generic.go:358] "Generic (PLEG): container finished" podID="0c78a76c-298c-468a-a6bd-98bc2950f67a" containerID="ebde252d0529a262dda015aa3bad71aeabb9e7380fcef3c16d730e3a9dd054b6" exitCode=255 Mar 18 16:47:00.919505 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:00.919491 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" event={"ID":"0c78a76c-298c-468a-a6bd-98bc2950f67a","Type":"ContainerDied","Data":"ebde252d0529a262dda015aa3bad71aeabb9e7380fcef3c16d730e3a9dd054b6"} Mar 18 16:47:00.919545 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:00.919526 2570 scope.go:117] "RemoveContainer" containerID="1247fb894725604dd60174383e23e373e3a4b7ec7976680bbf5924695ee0f96c" Mar 18 16:47:00.919899 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:00.919883 2570 scope.go:117] "RemoveContainer" containerID="ebde252d0529a262dda015aa3bad71aeabb9e7380fcef3c16d730e3a9dd054b6" Mar 18 16:47:00.920060 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:47:00.920042 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-gjg6w_openshift-console-operator(0c78a76c-298c-468a-a6bd-98bc2950f67a)\"" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" podUID="0c78a76c-298c-468a-a6bd-98bc2950f67a" Mar 18 16:47:01.924014 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:01.923989 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:47:06.503824 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:06.503778 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:47:06.503824 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:06.503822 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:47:06.504245 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:06.504218 2570 scope.go:117] "RemoveContainer" containerID="ebde252d0529a262dda015aa3bad71aeabb9e7380fcef3c16d730e3a9dd054b6" Mar 18 16:47:06.504433 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:47:06.504413 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-gjg6w_openshift-console-operator(0c78a76c-298c-468a-a6bd-98bc2950f67a)\"" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" podUID="0c78a76c-298c-468a-a6bd-98bc2950f67a" Mar 18 16:47:08.119836 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.119803 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:47:08.120234 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.119858 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:47:08.120502 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.120474 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b285194-6029-4441-b4e2-56fdcc973573-service-ca-bundle\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:47:08.122536 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.122509 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b285194-6029-4441-b4e2-56fdcc973573-metrics-certs\") pod \"router-default-7bd4c46cc4-vslwk\" (UID: \"7b285194-6029-4441-b4e2-56fdcc973573\") " pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:47:08.288916 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.288882 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sf568\"" Mar 18 16:47:08.296919 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.296859 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:47:08.428119 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.428088 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bd4c46cc4-vslwk"] Mar 18 16:47:08.431566 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:08.431521 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b285194_6029_4441_b4e2_56fdcc973573.slice/crio-6235da774216919c16b33b0c9d58265bcfe3d2364f34735d652d4d51f6c40e96 WatchSource:0}: Error finding container 6235da774216919c16b33b0c9d58265bcfe3d2364f34735d652d4d51f6c40e96: Status 404 returned error can't find the container with id 6235da774216919c16b33b0c9d58265bcfe3d2364f34735d652d4d51f6c40e96 Mar 18 16:47:08.941526 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.941490 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" event={"ID":"7b285194-6029-4441-b4e2-56fdcc973573","Type":"ContainerStarted","Data":"57007c16674214ec4bb0b1c9e7be429e5292585c3022522e6e3a344d1d25df9d"} Mar 18 16:47:08.941526 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.941527 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" event={"ID":"7b285194-6029-4441-b4e2-56fdcc973573","Type":"ContainerStarted","Data":"6235da774216919c16b33b0c9d58265bcfe3d2364f34735d652d4d51f6c40e96"} Mar 18 16:47:08.962378 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:08.962323 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" podStartSLOduration=32.962306767 podStartE2EDuration="32.962306767s" podCreationTimestamp="2026-03-18 16:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
16:47:08.960500341 +0000 UTC m=+153.066151702" watchObservedRunningTime="2026-03-18 16:47:08.962306767 +0000 UTC m=+153.067958129" Mar 18 16:47:09.297674 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:09.297639 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:47:09.300380 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:09.300353 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:47:09.943842 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:09.943795 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:47:09.945185 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:09.945160 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7bd4c46cc4-vslwk" Mar 18 16:47:10.501306 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.501270 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-tb98w"] Mar 18 16:47:10.503278 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.503257 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" Mar 18 16:47:10.506649 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.506625 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Mar 18 16:47:10.506649 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.506628 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Mar 18 16:47:10.506912 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.506671 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-l8rcf\"" Mar 18 16:47:10.512405 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.512379 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-tb98w"] Mar 18 16:47:10.539487 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.539448 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f9f34891-1c31-4c9e-9365-63ede3d6127d-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-tb98w\" (UID: \"f9f34891-1c31-4c9e-9365-63ede3d6127d\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" Mar 18 16:47:10.539487 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.539491 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f9f34891-1c31-4c9e-9365-63ede3d6127d-nginx-conf\") pod \"networking-console-plugin-55b77584bb-tb98w\" (UID: \"f9f34891-1c31-4c9e-9365-63ede3d6127d\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" Mar 18 16:47:10.602776 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:47:10.602739 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2"] Mar 18 16:47:10.604586 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.604565 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2" Mar 18 16:47:10.608447 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.608422 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-ccwgq\"" Mar 18 16:47:10.614663 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.614636 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-55qz2"] Mar 18 16:47:10.616777 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.616737 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx"] Mar 18 16:47:10.616925 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.616859 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.618948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.618922 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2"] Mar 18 16:47:10.620139 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.619070 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" Mar 18 16:47:10.620139 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.619663 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Mar 18 16:47:10.620139 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.619894 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h8gkt\"" Mar 18 16:47:10.620139 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.619929 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Mar 18 16:47:10.621843 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.621780 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Mar 18 16:47:10.622047 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.622028 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-gcdqx\"" Mar 18 16:47:10.626972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.626930 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-55qz2"] Mar 18 16:47:10.633199 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.633168 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx"] Mar 18 16:47:10.640511 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.640478 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/477b54c8-2376-4b83-b755-27327a399096-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.640665 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.640523 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/477b54c8-2376-4b83-b755-27327a399096-crio-socket\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.640665 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.640618 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f9f34891-1c31-4c9e-9365-63ede3d6127d-nginx-conf\") pod \"networking-console-plugin-55b77584bb-tb98w\" (UID: \"f9f34891-1c31-4c9e-9365-63ede3d6127d\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" Mar 18 16:47:10.640810 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.640679 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cch8p\" (UniqueName: \"kubernetes.io/projected/548137bf-b85f-4d92-9e10-f9cc858486fb-kube-api-access-cch8p\") pod \"network-check-source-cc88fdd44-km9x2\" (UID: \"548137bf-b85f-4d92-9e10-f9cc858486fb\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2" Mar 18 16:47:10.640810 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.640737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f9f34891-1c31-4c9e-9365-63ede3d6127d-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-tb98w\" (UID: \"f9f34891-1c31-4c9e-9365-63ede3d6127d\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" Mar 18 
16:47:10.640810 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.640772 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/477b54c8-2376-4b83-b755-27327a399096-data-volume\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.640972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.640816 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/477b54c8-2376-4b83-b755-27327a399096-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.640972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.640850 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tv9m\" (UniqueName: \"kubernetes.io/projected/477b54c8-2376-4b83-b755-27327a399096-kube-api-access-9tv9m\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.640972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.640878 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7d73f442-384c-4654-89da-c1341a2fac11-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-pksvx\" (UID: \"7d73f442-384c-4654-89da-c1341a2fac11\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" Mar 18 16:47:10.641366 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.641329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f9f34891-1c31-4c9e-9365-63ede3d6127d-nginx-conf\") pod \"networking-console-plugin-55b77584bb-tb98w\" (UID: \"f9f34891-1c31-4c9e-9365-63ede3d6127d\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" Mar 18 16:47:10.643886 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.643858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f9f34891-1c31-4c9e-9365-63ede3d6127d-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-tb98w\" (UID: \"f9f34891-1c31-4c9e-9365-63ede3d6127d\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" Mar 18 16:47:10.741205 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741157 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cch8p\" (UniqueName: \"kubernetes.io/projected/548137bf-b85f-4d92-9e10-f9cc858486fb-kube-api-access-cch8p\") pod \"network-check-source-cc88fdd44-km9x2\" (UID: \"548137bf-b85f-4d92-9e10-f9cc858486fb\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2" Mar 18 16:47:10.741205 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741197 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/477b54c8-2376-4b83-b755-27327a399096-data-volume\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.741538 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741225 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/477b54c8-2376-4b83-b755-27327a399096-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-55qz2\" (UID: 
\"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.741538 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741244 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tv9m\" (UniqueName: \"kubernetes.io/projected/477b54c8-2376-4b83-b755-27327a399096-kube-api-access-9tv9m\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.741538 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741262 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7d73f442-384c-4654-89da-c1341a2fac11-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-pksvx\" (UID: \"7d73f442-384c-4654-89da-c1341a2fac11\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" Mar 18 16:47:10.741538 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741325 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/477b54c8-2376-4b83-b755-27327a399096-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.741538 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741348 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/477b54c8-2376-4b83-b755-27327a399096-crio-socket\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.741538 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:47:10.741430 2570 secret.go:189] Couldn't get secret 
openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Mar 18 16:47:10.741538 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741442 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/477b54c8-2376-4b83-b755-27327a399096-crio-socket\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.741538 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:47:10.741493 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d73f442-384c-4654-89da-c1341a2fac11-tls-certificates podName:7d73f442-384c-4654-89da-c1341a2fac11 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:11.241474169 +0000 UTC m=+155.347125514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/7d73f442-384c-4654-89da-c1341a2fac11-tls-certificates") pod "prometheus-operator-admission-webhook-8444df798b-pksvx" (UID: "7d73f442-384c-4654-89da-c1341a2fac11") : secret "prometheus-operator-admission-webhook-tls" not found Mar 18 16:47:10.741985 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/477b54c8-2376-4b83-b755-27327a399096-data-volume\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.741985 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.741958 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/477b54c8-2376-4b83-b755-27327a399096-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-55qz2\" (UID: 
\"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.744067 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.744038 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/477b54c8-2376-4b83-b755-27327a399096-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.752434 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.752363 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tv9m\" (UniqueName: \"kubernetes.io/projected/477b54c8-2376-4b83-b755-27327a399096-kube-api-access-9tv9m\") pod \"insights-runtime-extractor-55qz2\" (UID: \"477b54c8-2376-4b83-b755-27327a399096\") " pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.752575 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.752551 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cch8p\" (UniqueName: \"kubernetes.io/projected/548137bf-b85f-4d92-9e10-f9cc858486fb-kube-api-access-cch8p\") pod \"network-check-source-cc88fdd44-km9x2\" (UID: \"548137bf-b85f-4d92-9e10-f9cc858486fb\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2" Mar 18 16:47:10.812315 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.812275 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" Mar 18 16:47:10.915840 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.915791 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2" Mar 18 16:47:10.932575 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.932542 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-55qz2" Mar 18 16:47:10.951238 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:10.951204 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-tb98w"] Mar 18 16:47:10.956003 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:10.955965 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f34891_1c31_4c9e_9365_63ede3d6127d.slice/crio-4b94dca72b5da82d3eefd34586d38d2553a6656221848c6795e0e0e0efbfb203 WatchSource:0}: Error finding container 4b94dca72b5da82d3eefd34586d38d2553a6656221848c6795e0e0e0efbfb203: Status 404 returned error can't find the container with id 4b94dca72b5da82d3eefd34586d38d2553a6656221848c6795e0e0e0efbfb203 Mar 18 16:47:11.060784 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.060719 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2"] Mar 18 16:47:11.065324 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:11.065290 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod548137bf_b85f_4d92_9e10_f9cc858486fb.slice/crio-9790fd70c414479437b2dbf9c53e96eb6bc01331a77c0c95cc518ddd14b6b5d8 WatchSource:0}: Error finding container 9790fd70c414479437b2dbf9c53e96eb6bc01331a77c0c95cc518ddd14b6b5d8: Status 404 returned error can't find the container with id 9790fd70c414479437b2dbf9c53e96eb6bc01331a77c0c95cc518ddd14b6b5d8 Mar 18 16:47:11.096185 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.096151 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-55qz2"] Mar 18 16:47:11.103885 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:11.103854 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477b54c8_2376_4b83_b755_27327a399096.slice/crio-8d96796c2d34aa32e4290163dd1bd37c0a9b3cbcea53427ff5645f82e0c52ab0 WatchSource:0}: Error finding container 8d96796c2d34aa32e4290163dd1bd37c0a9b3cbcea53427ff5645f82e0c52ab0: Status 404 returned error can't find the container with id 8d96796c2d34aa32e4290163dd1bd37c0a9b3cbcea53427ff5645f82e0c52ab0 Mar 18 16:47:11.245290 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.245252 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7d73f442-384c-4654-89da-c1341a2fac11-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-pksvx\" (UID: \"7d73f442-384c-4654-89da-c1341a2fac11\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" Mar 18 16:47:11.247859 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.247835 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7d73f442-384c-4654-89da-c1341a2fac11-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-pksvx\" (UID: \"7d73f442-384c-4654-89da-c1341a2fac11\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" Mar 18 16:47:11.541007 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.540963 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" Mar 18 16:47:11.685754 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.685717 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx"] Mar 18 16:47:11.696680 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:11.696637 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d73f442_384c_4654_89da_c1341a2fac11.slice/crio-96a2de9cc39fc80979c547c8aaabcc9e15db20bb8d49a60f99463a27789c360a WatchSource:0}: Error finding container 96a2de9cc39fc80979c547c8aaabcc9e15db20bb8d49a60f99463a27789c360a: Status 404 returned error can't find the container with id 96a2de9cc39fc80979c547c8aaabcc9e15db20bb8d49a60f99463a27789c360a Mar 18 16:47:11.958392 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.958341 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" event={"ID":"f9f34891-1c31-4c9e-9365-63ede3d6127d","Type":"ContainerStarted","Data":"4b94dca72b5da82d3eefd34586d38d2553a6656221848c6795e0e0e0efbfb203"} Mar 18 16:47:11.960660 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.960630 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-55qz2" event={"ID":"477b54c8-2376-4b83-b755-27327a399096","Type":"ContainerStarted","Data":"7f7f4ba41cbb167968ebab7c327bae2c73e3ae2b48616468fb0fc61accf8dbd9"} Mar 18 16:47:11.960660 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.960664 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-55qz2" event={"ID":"477b54c8-2376-4b83-b755-27327a399096","Type":"ContainerStarted","Data":"e3cc2ed260f9800352f7582802915d7d010bf246ca1aad376edba9696e2332b4"} Mar 18 16:47:11.960890 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:47:11.960673 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-55qz2" event={"ID":"477b54c8-2376-4b83-b755-27327a399096","Type":"ContainerStarted","Data":"8d96796c2d34aa32e4290163dd1bd37c0a9b3cbcea53427ff5645f82e0c52ab0"} Mar 18 16:47:11.962353 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.962319 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2" event={"ID":"548137bf-b85f-4d92-9e10-f9cc858486fb","Type":"ContainerStarted","Data":"e9900e47cfc8994984159ab8cf94addb00529f757c00a14a285c3ee88ddaac0c"} Mar 18 16:47:11.962470 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.962374 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2" event={"ID":"548137bf-b85f-4d92-9e10-f9cc858486fb","Type":"ContainerStarted","Data":"9790fd70c414479437b2dbf9c53e96eb6bc01331a77c0c95cc518ddd14b6b5d8"} Mar 18 16:47:11.963774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.963748 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" event={"ID":"7d73f442-384c-4654-89da-c1341a2fac11","Type":"ContainerStarted","Data":"96a2de9cc39fc80979c547c8aaabcc9e15db20bb8d49a60f99463a27789c360a"} Mar 18 16:47:11.979476 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:11.979408 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-km9x2" podStartSLOduration=1.97938853 podStartE2EDuration="1.97938853s" podCreationTimestamp="2026-03-18 16:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:47:11.97831285 +0000 UTC m=+156.083964212" watchObservedRunningTime="2026-03-18 16:47:11.97938853 +0000 UTC 
m=+156.085039887" Mar 18 16:47:13.812135 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:47:13.812079 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-t4psl" podUID="d386ca96-8632-46cd-b756-90a53fad9ef1" Mar 18 16:47:13.823287 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:47:13.823234 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4vc72" podUID="529555a4-f4da-4842-814f-1acffad52caf" Mar 18 16:47:13.970833 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:13.970793 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-55qz2" event={"ID":"477b54c8-2376-4b83-b755-27327a399096","Type":"ContainerStarted","Data":"f1d12623d737f51727f0fa25a0d5420e597b095a3cb4398ee48f8767e63ade37"} Mar 18 16:47:13.972286 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:13.972259 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" event={"ID":"7d73f442-384c-4654-89da-c1341a2fac11","Type":"ContainerStarted","Data":"55fc520cab97b5dbc67ae0f52d2a07f7238b468791f550a4fc096b04115d4d48"} Mar 18 16:47:13.972427 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:13.972399 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" Mar 18 16:47:13.973754 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:13.973729 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" 
event={"ID":"f9f34891-1c31-4c9e-9365-63ede3d6127d","Type":"ContainerStarted","Data":"e9acf7266ffad7c7d99274177aea2df2df4a114cd2a8871ca3eb14adc143686d"} Mar 18 16:47:13.973863 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:13.973757 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t4psl" Mar 18 16:47:13.978263 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:13.978235 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" Mar 18 16:47:13.989653 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:13.989598 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-55qz2" podStartSLOduration=1.9023652 podStartE2EDuration="3.989581175s" podCreationTimestamp="2026-03-18 16:47:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:11.150691475 +0000 UTC m=+155.256342818" lastFinishedPulling="2026-03-18 16:47:13.237907451 +0000 UTC m=+157.343558793" observedRunningTime="2026-03-18 16:47:13.989541833 +0000 UTC m=+158.095193195" watchObservedRunningTime="2026-03-18 16:47:13.989581175 +0000 UTC m=+158.095232538" Mar 18 16:47:14.010204 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:14.010132 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-pksvx" podStartSLOduration=2.475076311 podStartE2EDuration="4.010111392s" podCreationTimestamp="2026-03-18 16:47:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:11.698724891 +0000 UTC m=+155.804376231" lastFinishedPulling="2026-03-18 16:47:13.23375997 +0000 UTC m=+157.339411312" observedRunningTime="2026-03-18 16:47:14.009637688 +0000 UTC m=+158.115289050" watchObservedRunningTime="2026-03-18 16:47:14.010111392 +0000 UTC m=+158.115762755" Mar 18 16:47:14.035954 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:14.035884 
2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-55b77584bb-tb98w" podStartSLOduration=1.763212751 podStartE2EDuration="4.035861662s" podCreationTimestamp="2026-03-18 16:47:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:10.957861022 +0000 UTC m=+155.063512363" lastFinishedPulling="2026-03-18 16:47:13.230509928 +0000 UTC m=+157.336161274" observedRunningTime="2026-03-18 16:47:14.034815308 +0000 UTC m=+158.140466670" watchObservedRunningTime="2026-03-18 16:47:14.035861662 +0000 UTC m=+158.141513027" Mar 18 16:47:14.541216 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:47:14.541163 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ktpw5" podUID="38dc6da4-4394-4935-80a5-6a872bf72125" Mar 18 16:47:18.795998 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:18.795960 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" (UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl" Mar 18 16:47:18.796421 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:18.796023 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72" Mar 18 16:47:18.798801 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:18.798763 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d386ca96-8632-46cd-b756-90a53fad9ef1-metrics-tls\") pod \"dns-default-t4psl\" 
(UID: \"d386ca96-8632-46cd-b756-90a53fad9ef1\") " pod="openshift-dns/dns-default-t4psl" Mar 18 16:47:18.798953 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:18.798932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529555a4-f4da-4842-814f-1acffad52caf-cert\") pod \"ingress-canary-4vc72\" (UID: \"529555a4-f4da-4842-814f-1acffad52caf\") " pod="openshift-ingress-canary/ingress-canary-4vc72" Mar 18 16:47:19.080313 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.080221 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpjpl\"" Mar 18 16:47:19.085467 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.085442 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t4psl" Mar 18 16:47:19.231361 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.231325 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t4psl"] Mar 18 16:47:19.240385 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:19.240348 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd386ca96_8632_46cd_b756_90a53fad9ef1.slice/crio-fb4b5c41dc1e682b556411d72a6c7fabe0cba5aa1e6cd3b7c2ea056664110ccf WatchSource:0}: Error finding container fb4b5c41dc1e682b556411d72a6c7fabe0cba5aa1e6cd3b7c2ea056664110ccf: Status 404 returned error can't find the container with id fb4b5c41dc1e682b556411d72a6c7fabe0cba5aa1e6cd3b7c2ea056664110ccf Mar 18 16:47:19.381895 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.381184 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"] Mar 18 16:47:19.384755 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.384731 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v" Mar 18 16:47:19.393288 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.393255 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Mar 18 16:47:19.393903 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.393876 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Mar 18 16:47:19.394380 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.394355 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Mar 18 16:47:19.394509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.394381 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-c68rl\"" Mar 18 16:47:19.394509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.394355 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Mar 18 16:47:19.394772 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.394757 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Mar 18 16:47:19.400938 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.400901 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v" Mar 18 16:47:19.401098 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.400963 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.401098 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.401073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.401202 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.401107 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq8kb\" (UniqueName: \"kubernetes.io/projected/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-kube-api-access-rq8kb\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.411526 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.411493 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"]
Mar 18 16:47:19.413690 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.413668 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.418566 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.418532 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Mar 18 16:47:19.418755 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.418574 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Mar 18 16:47:19.420351 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.420331 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-rzvmn\""
Mar 18 16:47:19.420989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.420833 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Mar 18 16:47:19.432666 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.432634 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"]
Mar 18 16:47:19.433989 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.433964 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"]
Mar 18 16:47:19.435148 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.435121 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-r5scn"]
Mar 18 16:47:19.437665 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.437643 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.441087 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.441064 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Mar 18 16:47:19.441583 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.441564 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-775mt\""
Mar 18 16:47:19.441936 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.441901 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Mar 18 16:47:19.442199 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.442145 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Mar 18 16:47:19.502305 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502259 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.502305 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502312 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.502571 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502341 2570
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b771c1-e823-4290-b421-6cb942a7ae44-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.502571 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rq8kb\" (UniqueName: \"kubernetes.io/projected/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-kube-api-access-rq8kb\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.502571 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502395 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlnvz\" (UniqueName: \"kubernetes.io/projected/74b771c1-e823-4290-b421-6cb942a7ae44-kube-api-access-tlnvz\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.502571 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502430 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-wtmp\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.502571 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502460 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j457c\" (UniqueName: \"kubernetes.io/projected/89e76f6d-b92d-47b3-b48e-389e2a9574b6-kube-api-access-j457c\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.502571 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502491 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-textfile\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.502571 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502514 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-tls\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.502571 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.503000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502636 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74b771c1-e823-4290-b421-6cb942a7ae44-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.503000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502754 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.503000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502792 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-accelerators-collector-config\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.503000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502846 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89e76f6d-b92d-47b3-b48e-389e2a9574b6-sys\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.503000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502868 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89e76f6d-b92d-47b3-b48e-389e2a9574b6-root\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.503000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName:
\"kubernetes.io/configmap/89e76f6d-b92d-47b3-b48e-389e2a9574b6-metrics-client-ca\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.503000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502934 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/74b771c1-e823-4290-b421-6cb942a7ae44-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.503000 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.502959 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/74b771c1-e823-4290-b421-6cb942a7ae44-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.503300 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.503006 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74b771c1-e823-4290-b421-6cb942a7ae44-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.505135 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.505109 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.505231 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.505157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.505231 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.505173 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.515420 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.515390 2570 scope.go:117] "RemoveContainer" containerID="ebde252d0529a262dda015aa3bad71aeabb9e7380fcef3c16d730e3a9dd054b6"
Mar 18 16:47:19.515611 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:47:19.515594 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-gjg6w_openshift-console-operator(0c78a76c-298c-468a-a6bd-98bc2950f67a)\"" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" podUID="0c78a76c-298c-468a-a6bd-98bc2950f67a"
Mar 18 16:47:19.528834 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.528801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq8kb\" (UniqueName: \"kubernetes.io/projected/1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8-kube-api-access-rq8kb\") pod \"openshift-state-metrics-68b5d5d464-jmc2v\" (UID: \"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.603498 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603464 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-textfile\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.603498 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603502 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-tls\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.603819 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603522 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74b771c1-e823-4290-b421-6cb942a7ae44-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.603819 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-accelerators-collector-config\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn" Mar 18
16:47:19.603819 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89e76f6d-b92d-47b3-b48e-389e2a9574b6-sys\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.603819 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89e76f6d-b92d-47b3-b48e-389e2a9574b6-root\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.603819 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89e76f6d-b92d-47b3-b48e-389e2a9574b6-metrics-client-ca\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.603819 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603741 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89e76f6d-b92d-47b3-b48e-389e2a9574b6-sys\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.603819 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89e76f6d-b92d-47b3-b48e-389e2a9574b6-root\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.604384 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/74b771c1-e823-4290-b421-6cb942a7ae44-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.604546 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74b771c1-e823-4290-b421-6cb942a7ae44-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.604547 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-textfile\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.603748 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/74b771c1-e823-4290-b421-6cb942a7ae44-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.604796 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/74b771c1-e823-4290-b421-6cb942a7ae44-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.604847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74b771c1-e823-4290-b421-6cb942a7ae44-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.604886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.604919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b771c1-e823-4290-b421-6cb942a7ae44-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.605114 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnvz\" (UniqueName: \"kubernetes.io/projected/74b771c1-e823-4290-b421-6cb942a7ae44-kube-api-access-tlnvz\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x" Mar 18 16:47:19.605738
ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.605117 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-accelerators-collector-config\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.605210 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89e76f6d-b92d-47b3-b48e-389e2a9574b6-metrics-client-ca\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.605159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-wtmp\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.605738 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.605329 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j457c\" (UniqueName: \"kubernetes.io/projected/89e76f6d-b92d-47b3-b48e-389e2a9574b6-kube-api-access-j457c\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.606661 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.605886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/74b771c1-e823-4290-b421-6cb942a7ae44-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.606661 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.606352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-wtmp\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.611455 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.608555 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74b771c1-e823-4290-b421-6cb942a7ae44-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.611455 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.608856 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.611455 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.609159 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89e76f6d-b92d-47b3-b48e-389e2a9574b6-node-exporter-tls\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.611825 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.611804 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b771c1-e823-4290-b421-6cb942a7ae44-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.623299 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.623268 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j457c\" (UniqueName: \"kubernetes.io/projected/89e76f6d-b92d-47b3-b48e-389e2a9574b6-kube-api-access-j457c\") pod \"node-exporter-r5scn\" (UID: \"89e76f6d-b92d-47b3-b48e-389e2a9574b6\") " pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.623472 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.623397 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlnvz\" (UniqueName: \"kubernetes.io/projected/74b771c1-e823-4290-b421-6cb942a7ae44-kube-api-access-tlnvz\") pod \"kube-state-metrics-6df7999c47-fpf7x\" (UID: \"74b771c1-e823-4290-b421-6cb942a7ae44\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.696026 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.695915 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"
Mar 18 16:47:19.724891 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.724853 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"
Mar 18 16:47:19.749657 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.749622 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-r5scn"
Mar 18 16:47:19.885661 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.885619 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v"]
Mar 18 16:47:19.889263 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:19.889224 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1be80e0a_d7cc_40fa_8ab9_5da6eceaddc8.slice/crio-9ea267b96ac5cbcedd0f177288dbc4ab43f8ec8d955877984d368e867a53f8e6 WatchSource:0}: Error finding container 9ea267b96ac5cbcedd0f177288dbc4ab43f8ec8d955877984d368e867a53f8e6: Status 404 returned error can't find the container with id 9ea267b96ac5cbcedd0f177288dbc4ab43f8ec8d955877984d368e867a53f8e6
Mar 18 16:47:19.913879 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.913835 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x"]
Mar 18 16:47:19.925952 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:19.925905 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74b771c1_e823_4290_b421_6cb942a7ae44.slice/crio-0c3f91613ac306364c90c9ef81d2adf03b1e471d117f68c2fae2a48b12bddff9 WatchSource:0}: Error finding container 0c3f91613ac306364c90c9ef81d2adf03b1e471d117f68c2fae2a48b12bddff9: Status 404 returned error can't find the container with id 0c3f91613ac306364c90c9ef81d2adf03b1e471d117f68c2fae2a48b12bddff9
Mar 18 16:47:19.991168 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.991126 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t4psl" event={"ID":"d386ca96-8632-46cd-b756-90a53fad9ef1","Type":"ContainerStarted","Data":"fb4b5c41dc1e682b556411d72a6c7fabe0cba5aa1e6cd3b7c2ea056664110ccf"}
Mar 18 16:47:19.992629 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.992512 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r5scn" event={"ID":"89e76f6d-b92d-47b3-b48e-389e2a9574b6","Type":"ContainerStarted","Data":"9a3396a08649d9a14ad4a20fd7b39cdfb16aafa5d5b6c2aada32c70cfab56725"}
Mar 18 16:47:19.994579 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.994538 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x" event={"ID":"74b771c1-e823-4290-b421-6cb942a7ae44","Type":"ContainerStarted","Data":"0c3f91613ac306364c90c9ef81d2adf03b1e471d117f68c2fae2a48b12bddff9"}
Mar 18 16:47:19.996112 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.996084 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v" event={"ID":"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8","Type":"ContainerStarted","Data":"029a73054658a01ab3131306160e14a7022a8feab25cdaa66871a196382a5942"}
Mar 18 16:47:19.996112 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:19.996114 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v" event={"ID":"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8","Type":"ContainerStarted","Data":"9ea267b96ac5cbcedd0f177288dbc4ab43f8ec8d955877984d368e867a53f8e6"}
Mar 18 16:47:20.588105 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.588069 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:47:20.592412 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.592375 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.599233 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.599197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Mar 18 16:47:20.599392 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.599197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Mar 18 16:47:20.599392 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.599197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Mar 18 16:47:20.599669 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.599649 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Mar 18 16:47:20.599818 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.599725 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Mar 18 16:47:20.600301 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.600103 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Mar 18 16:47:20.600614 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.600442 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-g7wzf\""
Mar 18 16:47:20.600614 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.600519 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Mar 18 16:47:20.602638 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.602616 2570 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Mar 18 16:47:20.602768 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.602671 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Mar 18 16:47:20.615547 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615515 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.615547 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-volume\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.615808 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.615808 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615642 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.615808 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.615808 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615762 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.615808 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615795 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.616014 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615825 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbg9c\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-kube-api-access-bbg9c\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.616014 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615843 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-out\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.616014 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615858 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-web-config\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.616014 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.616014 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615895 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:20.616014 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.615952 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.640101 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.640064 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716460 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-volume\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716602 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716629 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716669 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716806 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716840 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbg9c\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-kube-api-access-bbg9c\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716868 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-out\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 
18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-web-config\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716920 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.717114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.716987 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.720622 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.718968 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.720622 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.719410 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.726959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.721564 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.726959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.724053 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-web-config\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.726959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.724097 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.726959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.724944 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.726959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.724997 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-volume\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.726959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.725492 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.726959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.726782 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.726959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.726956 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-out\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.727397 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.727214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.727397 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.727340 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.733662 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.733632 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbg9c\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-kube-api-access-bbg9c\") pod \"alertmanager-main-0\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:20.906105 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:20.905527 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:21.009665 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:21.009564 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t4psl" event={"ID":"d386ca96-8632-46cd-b756-90a53fad9ef1","Type":"ContainerStarted","Data":"8e571fcffb88fece53354413e175199301e160c317eb54b7cc41fb145fa28ef5"} Mar 18 16:47:21.019257 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:21.019186 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v" event={"ID":"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8","Type":"ContainerStarted","Data":"bfc0dd27dfafdb304bff0a416f4bfc8543c75ecbbade6b4208af58ec05daf781"} Mar 18 16:47:21.140997 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:21.140959 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:47:21.190029 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:21.189933 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23354495_d4ad_4d6d_a1e4_49d52c3791bb.slice/crio-a0a5b6ae27782e4a04c885c10b86e815d62736e245c29ed11c7abea30648cc98 WatchSource:0}: Error finding container a0a5b6ae27782e4a04c885c10b86e815d62736e245c29ed11c7abea30648cc98: Status 404 returned error can't find the container with id a0a5b6ae27782e4a04c885c10b86e815d62736e245c29ed11c7abea30648cc98 Mar 18 16:47:22.025491 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.025197 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v" event={"ID":"1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8","Type":"ContainerStarted","Data":"833f7fc127643339a4342ac3ca64945dccfdc596840a78ca5742e905ad5e9a0b"} Mar 18 16:47:22.029690 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.029572 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerStarted","Data":"a0a5b6ae27782e4a04c885c10b86e815d62736e245c29ed11c7abea30648cc98"} Mar 18 16:47:22.036359 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.036257 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t4psl" event={"ID":"d386ca96-8632-46cd-b756-90a53fad9ef1","Type":"ContainerStarted","Data":"c858f44286b54e167c7a2ab7ab6ed8ec81dfa479cb5da680f68425a30fda2abc"} Mar 18 16:47:22.036359 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.036306 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-t4psl" Mar 18 16:47:22.039289 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.039230 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r5scn" event={"ID":"89e76f6d-b92d-47b3-b48e-389e2a9574b6","Type":"ContainerStarted","Data":"8900f5726d8127347505d45d9fcac6d20ece83b0c38c7eb307ba6d0534f00171"} Mar 18 16:47:22.042039 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.041984 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x" event={"ID":"74b771c1-e823-4290-b421-6cb942a7ae44","Type":"ContainerStarted","Data":"04ba90c2d4ede8ab32a69cdef9a820e65b2a49418762f79fffb58f9bd0fd29db"} Mar 18 16:47:22.042190 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.042049 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x" event={"ID":"74b771c1-e823-4290-b421-6cb942a7ae44","Type":"ContainerStarted","Data":"5f0fb02627bafd19870755f521baa8bfbb351c976b85d82e110b9c92a198c7bc"} Mar 18 16:47:22.161606 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.161549 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-jmc2v" 
podStartSLOduration=1.4341893350000001 podStartE2EDuration="3.161529796s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="2026-03-18 16:47:20.069160432 +0000 UTC m=+164.174811790" lastFinishedPulling="2026-03-18 16:47:21.796500896 +0000 UTC m=+165.902152251" observedRunningTime="2026-03-18 16:47:22.12773738 +0000 UTC m=+166.233388742" watchObservedRunningTime="2026-03-18 16:47:22.161529796 +0000 UTC m=+166.267181159" Mar 18 16:47:22.304605 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.304537 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t4psl" podStartSLOduration=130.881516173 podStartE2EDuration="2m12.304521468s" podCreationTimestamp="2026-03-18 16:45:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:19.242422006 +0000 UTC m=+163.348073352" lastFinishedPulling="2026-03-18 16:47:20.665427291 +0000 UTC m=+164.771078647" observedRunningTime="2026-03-18 16:47:22.301980471 +0000 UTC m=+166.407631835" watchObservedRunningTime="2026-03-18 16:47:22.304521468 +0000 UTC m=+166.410172829" Mar 18 16:47:22.566732 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.566608 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2"] Mar 18 16:47:22.570289 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.570269 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.574052 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.574025 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Mar 18 16:47:22.574207 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.574177 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-qddgq\"" Mar 18 16:47:22.574440 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.574420 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Mar 18 16:47:22.574528 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.574440 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Mar 18 16:47:22.574528 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.574485 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-52ab8afu9nn0a\"" Mar 18 16:47:22.574779 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.574763 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Mar 18 16:47:22.574815 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.574783 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Mar 18 16:47:22.592250 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.592213 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2"] Mar 18 16:47:22.640767 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.640729 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.640948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.640793 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8hxw\" (UniqueName: \"kubernetes.io/projected/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-kube-api-access-r8hxw\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.640948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.640926 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.641058 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.640968 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.641058 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.641002 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-grpc-tls\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.641177 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.641114 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.641177 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.641146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-metrics-client-ca\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.641271 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.641194 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-tls\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.742523 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.742465 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.742523 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.742519 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-grpc-tls\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.742815 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.742714 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.742815 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.742755 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-metrics-client-ca\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.742894 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.742803 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-tls\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " 
pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.742894 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.742861 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.742983 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.742913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8hxw\" (UniqueName: \"kubernetes.io/projected/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-kube-api-access-r8hxw\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.743035 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.743006 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.743625 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.743575 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-metrics-client-ca\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.746027 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.746004 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-tls\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.746145 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.746068 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-grpc-tls\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.746145 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.746129 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.746220 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.746170 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.746261 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.746240 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy-rules\") pod 
\"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.746298 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.746265 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.761293 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.761254 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8hxw\" (UniqueName: \"kubernetes.io/projected/0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9-kube-api-access-r8hxw\") pod \"thanos-querier-6c4d988b9b-7xpv2\" (UID: \"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9\") " pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:22.923255 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:22.923148 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:23.046750 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.046656 2570 generic.go:358] "Generic (PLEG): container finished" podID="89e76f6d-b92d-47b3-b48e-389e2a9574b6" containerID="8900f5726d8127347505d45d9fcac6d20ece83b0c38c7eb307ba6d0534f00171" exitCode=0 Mar 18 16:47:23.047291 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.046781 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r5scn" event={"ID":"89e76f6d-b92d-47b3-b48e-389e2a9574b6","Type":"ContainerDied","Data":"8900f5726d8127347505d45d9fcac6d20ece83b0c38c7eb307ba6d0534f00171"} Mar 18 16:47:23.056164 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.056117 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x" event={"ID":"74b771c1-e823-4290-b421-6cb942a7ae44","Type":"ContainerStarted","Data":"583795eef5d290ffddab873adf201035aca9d18613b25c7f35d3367a6a76a7ab"} Mar 18 16:47:23.057978 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.057944 2570 generic.go:358] "Generic (PLEG): container finished" podID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerID="addf12400a1dd89d490535e770de1d278ed771d79f58dfb97279404a6c34692c" exitCode=0 Mar 18 16:47:23.058084 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.057996 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerDied","Data":"addf12400a1dd89d490535e770de1d278ed771d79f58dfb97279404a6c34692c"} Mar 18 16:47:23.107348 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.107165 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2"] Mar 18 16:47:23.118979 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:23.118944 2570 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ccf7ab2_3fd9_47f2_94af_3c2b3e77d0c9.slice/crio-00478c0db8d161a359467991a8ab95ea0fdd2b6c3bebcb9745be6e027c5995aa WatchSource:0}: Error finding container 00478c0db8d161a359467991a8ab95ea0fdd2b6c3bebcb9745be6e027c5995aa: Status 404 returned error can't find the container with id 00478c0db8d161a359467991a8ab95ea0fdd2b6c3bebcb9745be6e027c5995aa Mar 18 16:47:23.145561 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.145512 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-6df7999c47-fpf7x" podStartSLOduration=2.275784312 podStartE2EDuration="4.145496051s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="2026-03-18 16:47:19.92892899 +0000 UTC m=+164.034580336" lastFinishedPulling="2026-03-18 16:47:21.798640725 +0000 UTC m=+165.904292075" observedRunningTime="2026-03-18 16:47:23.143322401 +0000 UTC m=+167.248973762" watchObservedRunningTime="2026-03-18 16:47:23.145496051 +0000 UTC m=+167.251147412" Mar 18 16:47:23.877951 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.877879 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r"] Mar 18 16:47:23.881672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.881644 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" Mar 18 16:47:23.885766 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.885469 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-lvxpb\"" Mar 18 16:47:23.885923 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.885834 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Mar 18 16:47:23.900369 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.900319 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r"] Mar 18 16:47:23.955371 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.955334 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f2e621ea-1527-4a57-af6a-4a71da84cf37-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-ghr2r\" (UID: \"f2e621ea-1527-4a57-af6a-4a71da84cf37\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" Mar 18 16:47:23.988449 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.988405 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-b75c4fc57-7wkh4"] Mar 18 16:47:23.992583 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.992555 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:23.996348 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.995839 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-xcnpd\"" Mar 18 16:47:23.996348 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.995842 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dtapluisdooab\"" Mar 18 16:47:23.996348 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.995843 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Mar 18 16:47:23.997062 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.996869 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Mar 18 16:47:23.997062 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.996970 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Mar 18 16:47:23.998859 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:23.998802 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Mar 18 16:47:24.008291 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.008256 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b75c4fc57-7wkh4"] Mar 18 16:47:24.057004 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.056930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/211a4dff-a127-4785-a371-3aec44fa3a84-secret-metrics-server-client-certs\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: 
\"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.057462 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.057066 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/211a4dff-a127-4785-a371-3aec44fa3a84-metrics-server-audit-profiles\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.057462 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.057115 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cpqs\" (UniqueName: \"kubernetes.io/projected/211a4dff-a127-4785-a371-3aec44fa3a84-kube-api-access-5cpqs\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.057462 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.057202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/211a4dff-a127-4785-a371-3aec44fa3a84-audit-log\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.057462 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.057361 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/211a4dff-a127-4785-a371-3aec44fa3a84-secret-metrics-server-tls\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.057462 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:47:24.057394 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211a4dff-a127-4785-a371-3aec44fa3a84-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.057462 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.057427 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f2e621ea-1527-4a57-af6a-4a71da84cf37-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-ghr2r\" (UID: \"f2e621ea-1527-4a57-af6a-4a71da84cf37\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" Mar 18 16:47:24.057462 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.057462 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211a4dff-a127-4785-a371-3aec44fa3a84-client-ca-bundle\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.061146 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.061114 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f2e621ea-1527-4a57-af6a-4a71da84cf37-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-ghr2r\" (UID: \"f2e621ea-1527-4a57-af6a-4a71da84cf37\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" Mar 18 16:47:24.063669 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.063590 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" 
event={"ID":"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9","Type":"ContainerStarted","Data":"00478c0db8d161a359467991a8ab95ea0fdd2b6c3bebcb9745be6e027c5995aa"} Mar 18 16:47:24.066782 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.066744 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r5scn" event={"ID":"89e76f6d-b92d-47b3-b48e-389e2a9574b6","Type":"ContainerStarted","Data":"a2cfc9614b99dbe59199ad662b2620db5b33f74fc23fffa37da91fc6c55b965c"} Mar 18 16:47:24.066937 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.066793 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r5scn" event={"ID":"89e76f6d-b92d-47b3-b48e-389e2a9574b6","Type":"ContainerStarted","Data":"604015c6c04e5ae2a62d50d8aa4ef01cd65d6946bdcb21d91825702728d2f293"} Mar 18 16:47:24.106634 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.106571 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-r5scn" podStartSLOduration=3.075364104 podStartE2EDuration="5.106545748s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="2026-03-18 16:47:19.76532087 +0000 UTC m=+163.870972215" lastFinishedPulling="2026-03-18 16:47:21.796502516 +0000 UTC m=+165.902153859" observedRunningTime="2026-03-18 16:47:24.103925224 +0000 UTC m=+168.209576586" watchObservedRunningTime="2026-03-18 16:47:24.106545748 +0000 UTC m=+168.212197113" Mar 18 16:47:24.158570 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.158479 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/211a4dff-a127-4785-a371-3aec44fa3a84-secret-metrics-server-client-certs\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.158766 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:47:24.158636 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/211a4dff-a127-4785-a371-3aec44fa3a84-metrics-server-audit-profiles\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.158766 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.158690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cpqs\" (UniqueName: \"kubernetes.io/projected/211a4dff-a127-4785-a371-3aec44fa3a84-kube-api-access-5cpqs\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.158766 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.158737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/211a4dff-a127-4785-a371-3aec44fa3a84-audit-log\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.159142 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.158887 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/211a4dff-a127-4785-a371-3aec44fa3a84-secret-metrics-server-tls\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.159142 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.158920 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211a4dff-a127-4785-a371-3aec44fa3a84-configmap-kubelet-serving-ca-bundle\") 
pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.159142 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.158973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211a4dff-a127-4785-a371-3aec44fa3a84-client-ca-bundle\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.159687 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.159649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/211a4dff-a127-4785-a371-3aec44fa3a84-audit-log\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.160090 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.160049 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211a4dff-a127-4785-a371-3aec44fa3a84-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.160566 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.160298 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/211a4dff-a127-4785-a371-3aec44fa3a84-metrics-server-audit-profiles\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.163661 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.163596 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/211a4dff-a127-4785-a371-3aec44fa3a84-secret-metrics-server-tls\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.163850 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.163815 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211a4dff-a127-4785-a371-3aec44fa3a84-client-ca-bundle\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.164589 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.164539 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/211a4dff-a127-4785-a371-3aec44fa3a84-secret-metrics-server-client-certs\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.174459 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.174413 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cpqs\" (UniqueName: \"kubernetes.io/projected/211a4dff-a127-4785-a371-3aec44fa3a84-kube-api-access-5cpqs\") pod \"metrics-server-b75c4fc57-7wkh4\" (UID: \"211a4dff-a127-4785-a371-3aec44fa3a84\") " pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.195812 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.195773 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" Mar 18 16:47:24.305995 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.305960 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:24.632997 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.632961 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-df7957d7c-svr72"] Mar 18 16:47:24.638412 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.637604 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.644207 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.644160 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Mar 18 16:47:24.645381 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.645355 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Mar 18 16:47:24.645605 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.645579 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Mar 18 16:47:24.645689 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.645607 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-9fhp6\"" Mar 18 16:47:24.645689 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.645465 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Mar 18 16:47:24.645817 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.645418 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Mar 18 16:47:24.649378 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.649345 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Mar 18 16:47:24.652246 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.652188 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-df7957d7c-svr72"] Mar 18 16:47:24.663853 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.663810 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-federate-client-tls\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.664048 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.663867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a426097-9a72-428e-944c-7f658f3a6f6f-serving-certs-ca-bundle\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.664048 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.663929 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a426097-9a72-428e-944c-7f658f3a6f6f-metrics-client-ca\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.664048 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.663958 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a426097-9a72-428e-944c-7f658f3a6f6f-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.664217 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.664084 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-telemeter-client-tls\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.664217 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.664127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-secret-telemeter-client\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.664217 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.664184 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.664370 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.664225 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64bsg\" (UniqueName: \"kubernetes.io/projected/4a426097-9a72-428e-944c-7f658f3a6f6f-kube-api-access-64bsg\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " 
pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.765087 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.765040 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-telemeter-client-tls\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.765268 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.765113 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-secret-telemeter-client\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.765268 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.765146 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.765268 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.765180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64bsg\" (UniqueName: \"kubernetes.io/projected/4a426097-9a72-428e-944c-7f658f3a6f6f-kube-api-access-64bsg\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.765268 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.765250 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-federate-client-tls\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.765488 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.765282 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a426097-9a72-428e-944c-7f658f3a6f6f-serving-certs-ca-bundle\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.765488 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.765327 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a426097-9a72-428e-944c-7f658f3a6f6f-metrics-client-ca\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.765488 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.765354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a426097-9a72-428e-944c-7f658f3a6f6f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.766279 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.766249 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a426097-9a72-428e-944c-7f658f3a6f6f-serving-certs-ca-bundle\") pod \"telemeter-client-df7957d7c-svr72\" (UID: 
\"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.766399 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.766356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a426097-9a72-428e-944c-7f658f3a6f6f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.766716 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.766668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a426097-9a72-428e-944c-7f658f3a6f6f-metrics-client-ca\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.768470 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.768445 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-telemeter-client-tls\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.768575 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.768500 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-secret-telemeter-client\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.769025 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.769005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-federate-client-tls\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.769140 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.769118 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a426097-9a72-428e-944c-7f658f3a6f6f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.775163 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.775132 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64bsg\" (UniqueName: \"kubernetes.io/projected/4a426097-9a72-428e-944c-7f658f3a6f6f-kube-api-access-64bsg\") pod \"telemeter-client-df7957d7c-svr72\" (UID: \"4a426097-9a72-428e-944c-7f658f3a6f6f\") " pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:24.954472 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:24.954425 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" Mar 18 16:47:25.066318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:25.066278 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r"] Mar 18 16:47:25.152345 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:25.152223 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b75c4fc57-7wkh4"] Mar 18 16:47:25.158142 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:25.158103 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211a4dff_a127_4785_a371_3aec44fa3a84.slice/crio-37837725f57435794544e79a3d214996c70df5a26bad3964903ac68cbfdbd180 WatchSource:0}: Error finding container 37837725f57435794544e79a3d214996c70df5a26bad3964903ac68cbfdbd180: Status 404 returned error can't find the container with id 37837725f57435794544e79a3d214996c70df5a26bad3964903ac68cbfdbd180 Mar 18 16:47:25.213243 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:25.213180 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-df7957d7c-svr72"] Mar 18 16:47:25.224922 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:25.224885 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a426097_9a72_428e_944c_7f658f3a6f6f.slice/crio-9663c9354edccd470d14579bf4f84e2797e0eda33691349eb20e86f482ccd231 WatchSource:0}: Error finding container 9663c9354edccd470d14579bf4f84e2797e0eda33691349eb20e86f482ccd231: Status 404 returned error can't find the container with id 9663c9354edccd470d14579bf4f84e2797e0eda33691349eb20e86f482ccd231 Mar 18 16:47:26.118475 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.118432 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" 
event={"ID":"4a426097-9a72-428e-944c-7f658f3a6f6f","Type":"ContainerStarted","Data":"9663c9354edccd470d14579bf4f84e2797e0eda33691349eb20e86f482ccd231"} Mar 18 16:47:26.123025 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.122989 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerStarted","Data":"68e1fbba6c421f3b328c18e1f3c6e6f7cbbd2e726dbb7620bdbd49d8900d0725"} Mar 18 16:47:26.123208 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.123037 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerStarted","Data":"97e95d1910415018f03440e194a7f34566253495b5c791fc952ffb81afeb1a3f"} Mar 18 16:47:26.123208 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.123067 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerStarted","Data":"9840c9350a2667d4b6ed874e6d1418b74178fb005447e645020fb4281f9202bc"} Mar 18 16:47:26.123208 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.123080 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerStarted","Data":"ca107074dd6d57f78b26fad8b0a3041933439c9b5add0c8ad2434df71c0b1bf4"} Mar 18 16:47:26.123208 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.123093 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerStarted","Data":"c0fde5fab8335f3440dd7247974e693d4f422a7dc277ff08f229beee627235d5"} Mar 18 16:47:26.124511 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.124458 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" event={"ID":"f2e621ea-1527-4a57-af6a-4a71da84cf37","Type":"ContainerStarted","Data":"f77798816141b5737106b4c708455de7e800b54cc9a6c0e34dc906deac797600"} Mar 18 16:47:26.126079 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.126007 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" event={"ID":"211a4dff-a127-4785-a371-3aec44fa3a84","Type":"ContainerStarted","Data":"37837725f57435794544e79a3d214996c70df5a26bad3964903ac68cbfdbd180"} Mar 18 16:47:26.128463 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.128415 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" event={"ID":"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9","Type":"ContainerStarted","Data":"c9c9114d03ec8fee4e97dc589429b0cf93cf1447e6796d2dc922284dfb6a5c14"} Mar 18 16:47:26.128463 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.128453 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" event={"ID":"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9","Type":"ContainerStarted","Data":"5268d6bc16f79ae3522e92d65a25e43d6400754c5e71fcea8a6fb4b0f2211e05"} Mar 18 16:47:26.128463 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.128467 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" event={"ID":"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9","Type":"ContainerStarted","Data":"fc84de4cb40cfab54d6a1f67d24e28ba2cae63e3bbb728e792400a6aaa934179"} Mar 18 16:47:26.191480 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.190918 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:47:26.199468 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.199423 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.205050 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.204992 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Mar 18 16:47:26.205050 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.205002 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Mar 18 16:47:26.205263 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.205115 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Mar 18 16:47:26.205263 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.205190 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Mar 18 16:47:26.205338 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.205293 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Mar 18 16:47:26.205338 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.205302 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Mar 18 16:47:26.205338 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.205324 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Mar 18 16:47:26.205436 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.205411 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Mar 18 16:47:26.206120 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.205776 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Mar 18 16:47:26.206120 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.205974 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Mar 18 16:47:26.206587 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.206465 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Mar 18 16:47:26.206587 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.206477 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-v8xff\"" Mar 18 16:47:26.206952 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.206936 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9jiaq2go81fmn\"" Mar 18 16:47:26.208626 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.208606 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Mar 18 16:47:26.211717 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.211676 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:47:26.282534 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282497 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282758 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282549 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282758 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282600 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-config\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282758 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282627 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282758 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282811 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282972 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:47:26.282843 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282871 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282890 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282908 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-kube-api-access-hgwlt\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.282972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282941 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.283257 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.282991 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.283257 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.283028 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-config-out\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.283257 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.283062 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-web-config\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.283257 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.283078 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.283257 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.283103 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.283257 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.283125 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.283257 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.283159 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.383985 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.383896 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.383985 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.383953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-config\") pod \"prometheus-k8s-0\" (UID: 
\"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.383985 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.383986 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384238 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384060 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384238 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384155 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384238 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384238 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384238 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384207 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384238 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384227 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-kube-api-access-hgwlt\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384522 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384254 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384522 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384293 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384522 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384330 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-config-out\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384522 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384344 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-web-config\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384522 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384522 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384387 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384522 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384522 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:47:26.384435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384522 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384481 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.384958 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.384900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.386597 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.385288 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.386597 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.385942 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.388660 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.388186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.389039 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.388819 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.389841 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.389441 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.389841 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.389485 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.391055 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.391021 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.391282 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.391244 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.394099 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.394062 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.395192 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.395078 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-config\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.395192 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.395116 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-config-out\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.395192 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.395139 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-web-config\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.395415 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.395213 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.395464 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.395429 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.395522 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.395464 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.396914 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.396865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.408030 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.407993 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwlt\" (UniqueName: 
\"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-kube-api-access-hgwlt\") pod \"prometheus-k8s-0\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:26.513981 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:26.513944 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:27.540672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:27.540638 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:47:28.136159 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.136125 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" event={"ID":"211a4dff-a127-4785-a371-3aec44fa3a84","Type":"ContainerStarted","Data":"bf8b64952636a8bd47615ff143a59dff1d39a6888f62fcbbcb44b5c49a873761"} Mar 18 16:47:28.138779 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.138752 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" event={"ID":"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9","Type":"ContainerStarted","Data":"ba10ba1e77a8e67a682f686cb66927cf032afa0df26b934ae97a60c40f3ad514"} Mar 18 16:47:28.138961 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.138787 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" event={"ID":"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9","Type":"ContainerStarted","Data":"2ebc4f872b79ab0f3de573b162ca5beabf71fa0f49c101467279ac2e7641a050"} Mar 18 16:47:28.138961 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.138803 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" event={"ID":"0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9","Type":"ContainerStarted","Data":"c551e6d459c66277ddf7672f892cf20c45a129473b80c491b1e31af615edf7b0"} Mar 18 
16:47:28.139081 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.138962 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:28.140731 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.140680 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" event={"ID":"4a426097-9a72-428e-944c-7f658f3a6f6f","Type":"ContainerStarted","Data":"18abf27561fd389c5dd1f691b62f45caad4e08d44910edf712b87005845138ea"} Mar 18 16:47:28.140862 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.140740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" event={"ID":"4a426097-9a72-428e-944c-7f658f3a6f6f","Type":"ContainerStarted","Data":"2d17509f534a0a4c3202e5cf8f43c96ae13af31e4a6d4cfec1388df4606edf08"} Mar 18 16:47:28.140862 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.140755 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" event={"ID":"4a426097-9a72-428e-944c-7f658f3a6f6f","Type":"ContainerStarted","Data":"cf9bfb7bad679b7e0ad1ac32cdb213f2da7444da469bd95e7881a8e5f7eb5b9b"} Mar 18 16:47:28.143791 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.143751 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerStarted","Data":"4e3c207a86688e59747939acc008c439db42051ad60c9b2abfe98ebea6255bec"} Mar 18 16:47:28.145241 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.145216 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" event={"ID":"f2e621ea-1527-4a57-af6a-4a71da84cf37","Type":"ContainerStarted","Data":"0200336f8d6a85852a19ec897243c3be680613379a72e12ac1050d710737e193"} Mar 18 16:47:28.145379 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:47:28.145361 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" Mar 18 16:47:28.146870 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.146847 2570 generic.go:358] "Generic (PLEG): container finished" podID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerID="db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80" exitCode=0 Mar 18 16:47:28.146972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.146893 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerDied","Data":"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80"} Mar 18 16:47:28.146972 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.146914 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerStarted","Data":"8965939c3cad74c94917667f38166b86a55ade966b28d52b3e9b6c936b8c3e29"} Mar 18 16:47:28.150823 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.150804 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" Mar 18 16:47:28.158303 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.158251 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" podStartSLOduration=2.938237384 podStartE2EDuration="5.15823638s" podCreationTimestamp="2026-03-18 16:47:23 +0000 UTC" firstStartedPulling="2026-03-18 16:47:25.160231381 +0000 UTC m=+169.265882731" lastFinishedPulling="2026-03-18 16:47:27.380230384 +0000 UTC m=+171.485881727" observedRunningTime="2026-03-18 16:47:28.156731436 +0000 UTC m=+172.262382795" watchObservedRunningTime="2026-03-18 16:47:28.15823638 +0000 UTC m=+172.263887741" Mar 18 
16:47:28.181545 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.181486 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" podStartSLOduration=1.9224213190000001 podStartE2EDuration="6.181457265s" podCreationTimestamp="2026-03-18 16:47:22 +0000 UTC" firstStartedPulling="2026-03-18 16:47:23.121186628 +0000 UTC m=+167.226837982" lastFinishedPulling="2026-03-18 16:47:27.380222575 +0000 UTC m=+171.485873928" observedRunningTime="2026-03-18 16:47:28.179910636 +0000 UTC m=+172.285562064" watchObservedRunningTime="2026-03-18 16:47:28.181457265 +0000 UTC m=+172.287108671" Mar 18 16:47:28.198637 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.198564 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-ghr2r" podStartSLOduration=2.912902408 podStartE2EDuration="5.198544723s" podCreationTimestamp="2026-03-18 16:47:23 +0000 UTC" firstStartedPulling="2026-03-18 16:47:25.094581691 +0000 UTC m=+169.200233033" lastFinishedPulling="2026-03-18 16:47:27.380224008 +0000 UTC m=+171.485875348" observedRunningTime="2026-03-18 16:47:28.197286804 +0000 UTC m=+172.302938167" watchObservedRunningTime="2026-03-18 16:47:28.198544723 +0000 UTC m=+172.304196085" Mar 18 16:47:28.225979 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.225916 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-df7957d7c-svr72" podStartSLOduration=2.010444166 podStartE2EDuration="4.22589703s" podCreationTimestamp="2026-03-18 16:47:24 +0000 UTC" firstStartedPulling="2026-03-18 16:47:25.230182793 +0000 UTC m=+169.335834133" lastFinishedPulling="2026-03-18 16:47:27.445635648 +0000 UTC m=+171.551286997" observedRunningTime="2026-03-18 16:47:28.224291259 +0000 UTC m=+172.329942621" watchObservedRunningTime="2026-03-18 16:47:28.22589703 +0000 UTC m=+172.331548394" Mar 18 16:47:28.282995 
ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.282923 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.095645205 podStartE2EDuration="8.282905523s" podCreationTimestamp="2026-03-18 16:47:20 +0000 UTC" firstStartedPulling="2026-03-18 16:47:21.192964942 +0000 UTC m=+165.298616282" lastFinishedPulling="2026-03-18 16:47:27.380225261 +0000 UTC m=+171.485876600" observedRunningTime="2026-03-18 16:47:28.280861926 +0000 UTC m=+172.386513286" watchObservedRunningTime="2026-03-18 16:47:28.282905523 +0000 UTC m=+172.388556910" Mar 18 16:47:28.514778 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.514653 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:47:28.514778 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.514653 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4vc72" Mar 18 16:47:28.517764 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.517694 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v9vx\"" Mar 18 16:47:28.525987 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.525948 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4vc72" Mar 18 16:47:28.660101 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:28.660064 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4vc72"] Mar 18 16:47:28.664374 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:47:28.664335 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod529555a4_f4da_4842_814f_1acffad52caf.slice/crio-93faac568617832c12ff767800a4c24b052713ad74c7061fd0c8d96fa0609559 WatchSource:0}: Error finding container 93faac568617832c12ff767800a4c24b052713ad74c7061fd0c8d96fa0609559: Status 404 returned error can't find the container with id 93faac568617832c12ff767800a4c24b052713ad74c7061fd0c8d96fa0609559 Mar 18 16:47:29.152740 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:29.152669 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4vc72" event={"ID":"529555a4-f4da-4842-814f-1acffad52caf","Type":"ContainerStarted","Data":"93faac568617832c12ff767800a4c24b052713ad74c7061fd0c8d96fa0609559"} Mar 18 16:47:32.061203 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.061169 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t4psl" Mar 18 16:47:32.165241 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.165201 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4vc72" event={"ID":"529555a4-f4da-4842-814f-1acffad52caf","Type":"ContainerStarted","Data":"fa3502a0e2f02be129e3e74acbc600f4ceada84783f903bd396e62a27d6f2bcc"} Mar 18 16:47:32.168397 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.168367 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerStarted","Data":"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802"} Mar 18 16:47:32.168397 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.168403 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerStarted","Data":"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971"} Mar 18 16:47:32.168598 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.168413 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerStarted","Data":"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c"} Mar 18 16:47:32.168598 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.168422 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerStarted","Data":"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19"} Mar 18 16:47:32.168598 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.168430 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerStarted","Data":"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc"} Mar 18 16:47:32.168598 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.168438 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerStarted","Data":"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086"} Mar 18 16:47:32.184154 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.184094 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-4vc72" podStartSLOduration=139.531670653 podStartE2EDuration="2m22.18407532s" podCreationTimestamp="2026-03-18 16:45:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:28.666359267 +0000 UTC m=+172.772010606" lastFinishedPulling="2026-03-18 16:47:31.318763927 +0000 UTC m=+175.424415273" observedRunningTime="2026-03-18 16:47:32.182541436 +0000 UTC m=+176.288192798" watchObservedRunningTime="2026-03-18 16:47:32.18407532 +0000 UTC m=+176.289726681" Mar 18 16:47:32.211615 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:32.211550 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.036230329 podStartE2EDuration="6.211530177s" podCreationTimestamp="2026-03-18 16:47:26 +0000 UTC" firstStartedPulling="2026-03-18 16:47:28.148272825 +0000 UTC m=+172.253924179" lastFinishedPulling="2026-03-18 16:47:31.323572687 +0000 UTC m=+175.429224027" observedRunningTime="2026-03-18 16:47:32.211146919 +0000 UTC m=+176.316798281" watchObservedRunningTime="2026-03-18 16:47:32.211530177 +0000 UTC m=+176.317181540" Mar 18 16:47:34.160613 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:34.160579 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6c4d988b9b-7xpv2" Mar 18 16:47:34.514853 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:34.514770 2570 scope.go:117] "RemoveContainer" containerID="ebde252d0529a262dda015aa3bad71aeabb9e7380fcef3c16d730e3a9dd054b6" Mar 18 16:47:35.180114 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:35.180084 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:47:35.180510 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:35.180142 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-76b8565867-gjg6w" event={"ID":"0c78a76c-298c-468a-a6bd-98bc2950f67a","Type":"ContainerStarted","Data":"1c0109191ce892176680955feb6578ab489ac92d8cc61256382dd880392ad7f4"} Mar 18 16:47:35.180510 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:35.180417 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:47:35.199332 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:35.199277 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" podStartSLOduration=57.001336317 podStartE2EDuration="59.199260665s" podCreationTimestamp="2026-03-18 16:46:36 +0000 UTC" firstStartedPulling="2026-03-18 16:46:36.638476723 +0000 UTC m=+120.744128063" lastFinishedPulling="2026-03-18 16:46:38.836401072 +0000 UTC m=+122.942052411" observedRunningTime="2026-03-18 16:47:35.197674843 +0000 UTC m=+179.303326206" watchObservedRunningTime="2026-03-18 16:47:35.199260665 +0000 UTC m=+179.304912027" Mar 18 16:47:35.277399 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:35.277365 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b8565867-gjg6w" Mar 18 16:47:36.514231 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:36.514195 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:44.306903 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:44.306855 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:44.306903 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:44.306901 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:47:50.229155 
ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:50.229122 2570 generic.go:358] "Generic (PLEG): container finished" podID="b81d0e21-b085-449c-848a-8150e032f670" containerID="2c59a468fb460012b45c5114ba0000f5f07394e2d24d06f4d884e34a0abe9da2" exitCode=0 Mar 18 16:47:50.229645 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:50.229197 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-p44bj" event={"ID":"b81d0e21-b085-449c-848a-8150e032f670","Type":"ContainerDied","Data":"2c59a468fb460012b45c5114ba0000f5f07394e2d24d06f4d884e34a0abe9da2"} Mar 18 16:47:50.229744 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:50.229647 2570 scope.go:117] "RemoveContainer" containerID="2c59a468fb460012b45c5114ba0000f5f07394e2d24d06f4d884e34a0abe9da2" Mar 18 16:47:51.234559 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:51.234484 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-p44bj" event={"ID":"b81d0e21-b085-449c-848a-8150e032f670","Type":"ContainerStarted","Data":"cfda21c71c7a583a54ea4f08d1bbbf3e8ed71a3291ac8c593ebb3487958d2145"} Mar 18 16:47:51.950888 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:51.950852 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23354495-d4ad-4d6d-a1e4-49d52c3791bb/init-config-reloader/0.log" Mar 18 16:47:52.150652 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:52.150619 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23354495-d4ad-4d6d-a1e4-49d52c3791bb/alertmanager/0.log" Mar 18 16:47:52.351188 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:52.351155 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23354495-d4ad-4d6d-a1e4-49d52c3791bb/config-reloader/0.log" Mar 18 16:47:52.550963 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:52.550937 2570 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23354495-d4ad-4d6d-a1e4-49d52c3791bb/kube-rbac-proxy-web/0.log" Mar 18 16:47:52.750643 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:52.750550 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23354495-d4ad-4d6d-a1e4-49d52c3791bb/kube-rbac-proxy/0.log" Mar 18 16:47:52.949969 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:52.949937 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23354495-d4ad-4d6d-a1e4-49d52c3791bb/kube-rbac-proxy-metric/0.log" Mar 18 16:47:53.151817 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:53.151783 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23354495-d4ad-4d6d-a1e4-49d52c3791bb/prom-label-proxy/0.log" Mar 18 16:47:53.550795 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:53.550761 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-fpf7x_74b771c1-e823-4290-b421-6cb942a7ae44/kube-state-metrics/0.log" Mar 18 16:47:53.750290 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:53.750251 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-fpf7x_74b771c1-e823-4290-b421-6cb942a7ae44/kube-rbac-proxy-main/0.log" Mar 18 16:47:53.950281 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:53.950201 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-fpf7x_74b771c1-e823-4290-b421-6cb942a7ae44/kube-rbac-proxy-self/0.log" Mar 18 16:47:54.151129 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:54.151102 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-b75c4fc57-7wkh4_211a4dff-a127-4785-a371-3aec44fa3a84/metrics-server/0.log" Mar 18 16:47:54.350632 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:47:54.350596 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-6d47bdb78d-ghr2r_f2e621ea-1527-4a57-af6a-4a71da84cf37/monitoring-plugin/0.log" Mar 18 16:47:55.163870 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:55.163838 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r5scn_89e76f6d-b92d-47b3-b48e-389e2a9574b6/init-textfile/0.log" Mar 18 16:47:55.351836 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:55.351804 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r5scn_89e76f6d-b92d-47b3-b48e-389e2a9574b6/node-exporter/0.log" Mar 18 16:47:55.551567 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:55.551543 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r5scn_89e76f6d-b92d-47b3-b48e-389e2a9574b6/kube-rbac-proxy/0.log" Mar 18 16:47:56.350108 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:56.350081 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-jmc2v_1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8/kube-rbac-proxy-main/0.log" Mar 18 16:47:56.550484 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:56.550442 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-jmc2v_1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8/kube-rbac-proxy-self/0.log" Mar 18 16:47:56.751898 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:56.751807 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-jmc2v_1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8/openshift-state-metrics/0.log" Mar 18 16:47:56.951029 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:56.950995 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ccd9dca3-875d-4932-9227-5bb3620903e2/init-config-reloader/0.log" Mar 18 16:47:57.159754 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:57.159722 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ccd9dca3-875d-4932-9227-5bb3620903e2/prometheus/0.log" Mar 18 16:47:57.350991 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:57.350963 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ccd9dca3-875d-4932-9227-5bb3620903e2/config-reloader/0.log" Mar 18 16:47:57.551067 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:57.551030 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ccd9dca3-875d-4932-9227-5bb3620903e2/thanos-sidecar/0.log" Mar 18 16:47:57.750953 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:57.750920 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ccd9dca3-875d-4932-9227-5bb3620903e2/kube-rbac-proxy-web/0.log" Mar 18 16:47:57.950026 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:57.949938 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ccd9dca3-875d-4932-9227-5bb3620903e2/kube-rbac-proxy/0.log" Mar 18 16:47:58.150626 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:58.150590 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ccd9dca3-875d-4932-9227-5bb3620903e2/kube-rbac-proxy-thanos/0.log" Mar 18 16:47:58.750462 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:58.750435 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-8444df798b-pksvx_7d73f442-384c-4654-89da-c1341a2fac11/prometheus-operator-admission-webhook/0.log" Mar 18 16:47:58.951235 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:58.951207 2570 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-df7957d7c-svr72_4a426097-9a72-428e-944c-7f658f3a6f6f/telemeter-client/0.log" Mar 18 16:47:59.149905 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:59.149873 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-df7957d7c-svr72_4a426097-9a72-428e-944c-7f658f3a6f6f/reload/0.log" Mar 18 16:47:59.350105 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:59.350075 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-df7957d7c-svr72_4a426097-9a72-428e-944c-7f658f3a6f6f/kube-rbac-proxy/0.log" Mar 18 16:47:59.550910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:59.550881 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/thanos-query/0.log" Mar 18 16:47:59.749469 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:59.749443 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/kube-rbac-proxy-web/0.log" Mar 18 16:47:59.954073 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:47:59.953992 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/kube-rbac-proxy/0.log" Mar 18 16:48:00.150220 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:00.150190 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/prom-label-proxy/0.log" Mar 18 16:48:00.350511 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:00.350473 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/kube-rbac-proxy-rules/0.log" Mar 18 16:48:00.550835 
ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:00.550808 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/kube-rbac-proxy-metrics/0.log" Mar 18 16:48:00.750941 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:00.750856 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-55b77584bb-tb98w_f9f34891-1c31-4c9e-9365-63ede3d6127d/networking-console-plugin/0.log" Mar 18 16:48:00.952074 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:00.952041 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:48:01.153978 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:01.153939 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/3.log" Mar 18 16:48:01.752220 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:01.752192 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7bd4c46cc4-vslwk_7b285194-6029-4441-b4e2-56fdcc973573/router/0.log" Mar 18 16:48:01.950032 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:01.950001 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4vc72_529555a4-f4da-4842-814f-1acffad52caf/serve-healthcheck-canary/0.log" Mar 18 16:48:04.318375 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:04.318341 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:48:04.324659 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:04.324624 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/metrics-server-b75c4fc57-7wkh4" Mar 18 16:48:26.519588 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:26.519559 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:26.532259 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:26.532232 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:27.375382 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:27.375356 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:40.282512 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.282461 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:40.283148 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.283077 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="alertmanager" containerID="cri-o://68e1fbba6c421f3b328c18e1f3c6e6f7cbbd2e726dbb7620bdbd49d8900d0725" gracePeriod=120 Mar 18 16:48:40.283148 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.283109 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy-metric" containerID="cri-o://97e95d1910415018f03440e194a7f34566253495b5c791fc952ffb81afeb1a3f" gracePeriod=120 Mar 18 16:48:40.283271 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.283137 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="prom-label-proxy" containerID="cri-o://4e3c207a86688e59747939acc008c439db42051ad60c9b2abfe98ebea6255bec" 
gracePeriod=120
Mar 18 16:48:40.283271 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.283175 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy" containerID="cri-o://9840c9350a2667d4b6ed874e6d1418b74178fb005447e645020fb4281f9202bc" gracePeriod=120
Mar 18 16:48:40.283271 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.283172 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="config-reloader" containerID="cri-o://c0fde5fab8335f3440dd7247974e693d4f422a7dc277ff08f229beee627235d5" gracePeriod=120
Mar 18 16:48:40.283408 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.283110 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy-web" containerID="cri-o://ca107074dd6d57f78b26fad8b0a3041933439c9b5add0c8ad2434df71c0b1bf4" gracePeriod=120
Mar 18 16:48:40.404376 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.404348 2570 generic.go:358] "Generic (PLEG): container finished" podID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerID="4e3c207a86688e59747939acc008c439db42051ad60c9b2abfe98ebea6255bec" exitCode=0
Mar 18 16:48:40.404376 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.404373 2570 generic.go:358] "Generic (PLEG): container finished" podID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerID="9840c9350a2667d4b6ed874e6d1418b74178fb005447e645020fb4281f9202bc" exitCode=0
Mar 18 16:48:40.404376 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.404380 2570 generic.go:358] "Generic (PLEG): container finished" podID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerID="c0fde5fab8335f3440dd7247974e693d4f422a7dc277ff08f229beee627235d5" exitCode=0
Mar 18 16:48:40.404603 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.404389 2570 generic.go:358] "Generic (PLEG): container finished" podID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerID="68e1fbba6c421f3b328c18e1f3c6e6f7cbbd2e726dbb7620bdbd49d8900d0725" exitCode=0
Mar 18 16:48:40.404603 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.404424 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerDied","Data":"4e3c207a86688e59747939acc008c439db42051ad60c9b2abfe98ebea6255bec"}
Mar 18 16:48:40.404603 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.404470 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerDied","Data":"9840c9350a2667d4b6ed874e6d1418b74178fb005447e645020fb4281f9202bc"}
Mar 18 16:48:40.404603 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.404484 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerDied","Data":"c0fde5fab8335f3440dd7247974e693d4f422a7dc277ff08f229beee627235d5"}
Mar 18 16:48:40.404603 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:40.404493 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerDied","Data":"68e1fbba6c421f3b328c18e1f3c6e6f7cbbd2e726dbb7620bdbd49d8900d0725"}
Mar 18 16:48:41.411317 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.411287 2570 generic.go:358] "Generic (PLEG): container finished" podID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerID="97e95d1910415018f03440e194a7f34566253495b5c791fc952ffb81afeb1a3f" exitCode=0
Mar 18 16:48:41.411317 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.411311 2570 generic.go:358] "Generic (PLEG): container finished" podID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerID="ca107074dd6d57f78b26fad8b0a3041933439c9b5add0c8ad2434df71c0b1bf4" exitCode=0
Mar 18 16:48:41.411750 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.411356 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerDied","Data":"97e95d1910415018f03440e194a7f34566253495b5c791fc952ffb81afeb1a3f"}
Mar 18 16:48:41.411750 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.411391 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerDied","Data":"ca107074dd6d57f78b26fad8b0a3041933439c9b5add0c8ad2434df71c0b1bf4"}
Mar 18 16:48:41.549140 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.549114 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.606986 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbg9c\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-kube-api-access-bbg9c\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607043 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-tls-assets\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607113 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607148 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-cluster-tls-config\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607187 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-volume\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607221 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607250 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-out\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607286 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-main-tls\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607351 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-metrics-client-ca\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607386 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-main-db\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607413 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-trusted-ca-bundle\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607441 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-web\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.607490 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-web-config\") pod \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\" (UID: \"23354495-d4ad-4d6d-a1e4-49d52c3791bb\") "
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.610235 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:48:41.612774 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.610869 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:48:41.627900 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.627820 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:41.631426 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.631384 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-out" (OuterVolumeSpecName: "config-out") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:48:41.631878 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.631821 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:48:41.633019 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.632723 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:41.633019 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.632858 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-kube-api-access-bbg9c" (OuterVolumeSpecName: "kube-api-access-bbg9c") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "kube-api-access-bbg9c". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:48:41.633019 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.632929 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-volume" (OuterVolumeSpecName: "config-volume") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:41.633019 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.632983 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:48:41.633652 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.633621 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:41.634043 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.634013 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:41.639660 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.639621 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:41.645519 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.645486 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-web-config" (OuterVolumeSpecName: "web-config") pod "23354495-d4ad-4d6d-a1e4-49d52c3791bb" (UID: "23354495-d4ad-4d6d-a1e4-49d52c3791bb"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:41.709142 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709090 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-web-config\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709142 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709136 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bbg9c\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-kube-api-access-bbg9c\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709142 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709147 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23354495-d4ad-4d6d-a1e4-49d52c3791bb-tls-assets\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709142 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709160 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709422 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709169 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-cluster-tls-config\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709422 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709179 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-volume\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709422 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709188 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709422 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709198 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-config-out\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709422 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709206 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-main-tls\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709422 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709215 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-metrics-client-ca\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709422 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709223 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-main-db\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709422 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709232 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23354495-d4ad-4d6d-a1e4-49d52c3791bb-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.709422 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:41.709240 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/23354495-d4ad-4d6d-a1e4-49d52c3791bb-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\""
Mar 18 16:48:42.417471 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.417435 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23354495-d4ad-4d6d-a1e4-49d52c3791bb","Type":"ContainerDied","Data":"a0a5b6ae27782e4a04c885c10b86e815d62736e245c29ed11c7abea30648cc98"}
Mar 18 16:48:42.417887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.417475 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.417887 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.417486 2570 scope.go:117] "RemoveContainer" containerID="4e3c207a86688e59747939acc008c439db42051ad60c9b2abfe98ebea6255bec"
Mar 18 16:48:42.427614 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.427589 2570 scope.go:117] "RemoveContainer" containerID="97e95d1910415018f03440e194a7f34566253495b5c791fc952ffb81afeb1a3f"
Mar 18 16:48:42.435964 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.435940 2570 scope.go:117] "RemoveContainer" containerID="9840c9350a2667d4b6ed874e6d1418b74178fb005447e645020fb4281f9202bc"
Mar 18 16:48:42.443393 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.443355 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:48:42.445885 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.445860 2570 scope.go:117] "RemoveContainer" containerID="ca107074dd6d57f78b26fad8b0a3041933439c9b5add0c8ad2434df71c0b1bf4"
Mar 18 16:48:42.447720 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.447623 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:48:42.454603 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.454585 2570 scope.go:117] "RemoveContainer" containerID="c0fde5fab8335f3440dd7247974e693d4f422a7dc277ff08f229beee627235d5"
Mar 18 16:48:42.462940 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.462916 2570 scope.go:117] "RemoveContainer" containerID="68e1fbba6c421f3b328c18e1f3c6e6f7cbbd2e726dbb7620bdbd49d8900d0725"
Mar 18 16:48:42.471561 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.471535 2570 scope.go:117] "RemoveContainer" containerID="addf12400a1dd89d490535e770de1d278ed771d79f58dfb97279404a6c34692c"
Mar 18 16:48:42.474809 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.474780 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:48:42.475190 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475176 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="prom-label-proxy"
Mar 18 16:48:42.475248 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475192 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="prom-label-proxy"
Mar 18 16:48:42.475248 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475201 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="config-reloader"
Mar 18 16:48:42.475248 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475210 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="config-reloader"
Mar 18 16:48:42.475248 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475221 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="init-config-reloader"
Mar 18 16:48:42.475248 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475227 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="init-config-reloader"
Mar 18 16:48:42.475248 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475245 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="alertmanager"
Mar 18 16:48:42.475248 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475250 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="alertmanager"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475258 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy-web"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475264 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy-web"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475272 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475278 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475285 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy-metric"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475295 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy-metric"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475345 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="prom-label-proxy"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475354 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="alertmanager"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475364 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy-web"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475373 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy-metric"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475380 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="kube-rbac-proxy"
Mar 18 16:48:42.475454 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.475387 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" containerName="config-reloader"
Mar 18 16:48:42.486276 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.486247 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.489154 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.489118 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Mar 18 16:48:42.489154 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.489118 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Mar 18 16:48:42.489361 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.489170 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Mar 18 16:48:42.489361 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.489130 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Mar 18 16:48:42.489524 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.489509 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Mar 18 16:48:42.489569 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.489507 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Mar 18 16:48:42.489569 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.489510 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Mar 18 16:48:42.489638 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.489565 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-g7wzf\""
Mar 18 16:48:42.489638 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.489538 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Mar 18 16:48:42.493430 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.493400 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:48:42.495577 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.495548 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Mar 18 16:48:42.516427 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516392 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.516427 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516434 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.516745 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516463 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.516745 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516516 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7fzn\" (UniqueName: \"kubernetes.io/projected/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-kube-api-access-n7fzn\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.516745 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516553 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.516745 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516581 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.516745 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516598 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.516745 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516619 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-config-volume\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.516745 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516662 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-web-config\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.516745 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516688 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-config-out\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.517034 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516773 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.517034 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516799 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.517034 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.516826 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.521733 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.521681 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23354495-d4ad-4d6d-a1e4-49d52c3791bb" path="/var/lib/kubelet/pods/23354495-d4ad-4d6d-a1e4-49d52c3791bb/volumes"
Mar 18 16:48:42.617993 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.617953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-config-out\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.618160 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.618026 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.618160 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.618044 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.618160 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.618066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.618160 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.618096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.618160 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.618117 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.618160 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.618138 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.618442 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.618162 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7fzn\" (UniqueName: \"kubernetes.io/projected/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-kube-api-access-n7fzn\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.618442 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.618196 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.618575 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.618541 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.619551 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.619031 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:42.619551 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.619101 2570 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.619551 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.619146 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-config-volume\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.619551 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.619195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-web-config\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.619874 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.619749 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.619936 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.619911 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.621443 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:48:42.621415 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.621620 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.621596 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.622198 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.622091 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-config-out\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.622198 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.622161 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.622352 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.622325 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.622431 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:48:42.622410 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.622484 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.622410 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-config-volume\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.622484 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.622474 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.623368 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.623349 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-web-config\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.628137 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.628104 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7fzn\" (UniqueName: \"kubernetes.io/projected/4f3b35b3-c3ae-478c-92b4-16e87ffd743e-kube-api-access-n7fzn\") pod \"alertmanager-main-0\" (UID: \"4f3b35b3-c3ae-478c-92b4-16e87ffd743e\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.797858 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.797822 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:42.943416 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:42.943368 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:42.947355 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:48:42.947327 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3b35b3_c3ae_478c_92b4_16e87ffd743e.slice/crio-3b12550787d358ce0be0afdfc4cb875dfb0c0853a0c796fa2053ab977fba1040 WatchSource:0}: Error finding container 3b12550787d358ce0be0afdfc4cb875dfb0c0853a0c796fa2053ab977fba1040: Status 404 returned error can't find the container with id 3b12550787d358ce0be0afdfc4cb875dfb0c0853a0c796fa2053ab977fba1040 Mar 18 16:48:43.422666 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:43.422632 2570 generic.go:358] "Generic (PLEG): container finished" podID="4f3b35b3-c3ae-478c-92b4-16e87ffd743e" containerID="ab017944e31fddf30889a7fd909816bcc0bfcf63092726a6dee2e3a7e8223680" exitCode=0 Mar 18 16:48:43.423082 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:43.422718 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4f3b35b3-c3ae-478c-92b4-16e87ffd743e","Type":"ContainerDied","Data":"ab017944e31fddf30889a7fd909816bcc0bfcf63092726a6dee2e3a7e8223680"} Mar 18 16:48:43.423082 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:43.422743 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4f3b35b3-c3ae-478c-92b4-16e87ffd743e","Type":"ContainerStarted","Data":"3b12550787d358ce0be0afdfc4cb875dfb0c0853a0c796fa2053ab977fba1040"} Mar 18 16:48:44.428919 ip-10-0-129-201 kubenswrapper[2570]: 
I0318 16:48:44.428884 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4f3b35b3-c3ae-478c-92b4-16e87ffd743e","Type":"ContainerStarted","Data":"fe9fcfac556b2efea14a36288fe43a484fac8007094699560dc64d6f5195bbbf"} Mar 18 16:48:44.428919 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.428923 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4f3b35b3-c3ae-478c-92b4-16e87ffd743e","Type":"ContainerStarted","Data":"66a719b53da383124dc479f3fd092152e29a0e2534386dfcf3a603d0f1223ef8"} Mar 18 16:48:44.429325 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.428934 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4f3b35b3-c3ae-478c-92b4-16e87ffd743e","Type":"ContainerStarted","Data":"ab43d19fea90d3b2491210014cd93889252666191ebf7ab1f08b8c1b90112793"} Mar 18 16:48:44.429325 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.428944 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4f3b35b3-c3ae-478c-92b4-16e87ffd743e","Type":"ContainerStarted","Data":"7071bb59820512d2bc5e2a5cbec654e3b50abb8687fbb69404dbbeed0fffba6c"} Mar 18 16:48:44.429325 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.428953 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4f3b35b3-c3ae-478c-92b4-16e87ffd743e","Type":"ContainerStarted","Data":"f5539872cb4a0b7d8263a91f2259d2750952a94c120e3f10811dd3a4ea794e62"} Mar 18 16:48:44.429325 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.428962 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4f3b35b3-c3ae-478c-92b4-16e87ffd743e","Type":"ContainerStarted","Data":"0946788fd07525bfc32bfaa98f5cdf8fccd671a24458357aacef9aa5577264b4"} Mar 18 16:48:44.465629 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:48:44.465572 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.465552942 podStartE2EDuration="2.465552942s" podCreationTimestamp="2026-03-18 16:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:48:44.463799704 +0000 UTC m=+248.569451067" watchObservedRunningTime="2026-03-18 16:48:44.465552942 +0000 UTC m=+248.571204308" Mar 18 16:48:44.544339 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.544303 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:48:44.544826 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.544762 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy-thanos" containerID="cri-o://5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" gracePeriod=600 Mar 18 16:48:44.544826 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.544796 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="config-reloader" containerID="cri-o://b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" gracePeriod=600 Mar 18 16:48:44.544826 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.544802 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy" containerID="cri-o://32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" gracePeriod=600 Mar 18 16:48:44.545037 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.544801 2570 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy-web" containerID="cri-o://14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" gracePeriod=600 Mar 18 16:48:44.545037 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.544834 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="thanos-sidecar" containerID="cri-o://443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" gracePeriod=600 Mar 18 16:48:44.545037 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.544766 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="prometheus" containerID="cri-o://11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" gracePeriod=600 Mar 18 16:48:44.802604 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.802580 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:44.839272 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839234 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-db\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839444 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839307 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-kube-rbac-proxy\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839444 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839338 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-web-config\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839444 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839425 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-rulefiles-0\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839606 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839465 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-metrics-client-ca\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 
16:48:44.839606 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839492 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-config\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839606 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839523 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839606 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839552 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-metrics-client-certs\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839606 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839587 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-trusted-ca-bundle\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839881 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839621 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-serving-certs-ca-bundle\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839881 
ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839652 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-kube-api-access-hgwlt\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839881 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839731 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-config-out\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839881 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839787 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-tls-assets\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839881 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839813 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839881 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839839 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-grpc-tls\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.839881 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839876 2570 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-thanos-prometheus-http-client-file\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.840280 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839917 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-tls\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.840280 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.839976 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-kubelet-serving-ca-bundle\") pod \"ccd9dca3-875d-4932-9227-5bb3620903e2\" (UID: \"ccd9dca3-875d-4932-9227-5bb3620903e2\") " Mar 18 16:48:44.840280 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.840173 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:44.840460 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.840438 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.840640 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.840476 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:44.841102 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.840960 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:44.841451 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.841419 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:44.841854 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.841802 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:44.843950 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.843805 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:44.845959 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.844494 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:44.846459 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.846298 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:44.846459 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.846371 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:44.846667 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.846557 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:44.847072 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.847027 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:44.847687 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.847650 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-kube-api-access-hgwlt" (OuterVolumeSpecName: "kube-api-access-hgwlt") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). 
InnerVolumeSpecName "kube-api-access-hgwlt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:44.847687 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.847650 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:44.847881 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.847796 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-config-out" (OuterVolumeSpecName: "config-out") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:44.848029 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.847993 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:44.848356 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.848319 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-config" (OuterVolumeSpecName: "config") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:44.849390 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.849357 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:44.860214 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.860175 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-web-config" (OuterVolumeSpecName: "web-config") pod "ccd9dca3-875d-4932-9227-5bb3620903e2" (UID: "ccd9dca3-875d-4932-9227-5bb3620903e2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:44.941518 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941477 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941518 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941510 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-metrics-client-ca\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941518 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941521 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-config\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941534 2570 
reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941546 2570 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-metrics-client-certs\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941556 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941568 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-kube-api-access-hgwlt\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941577 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-config-out\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941587 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ccd9dca3-875d-4932-9227-5bb3620903e2-tls-assets\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941596 2570 
reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941604 2570 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-grpc-tls\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941613 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941622 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941631 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd9dca3-875d-4932-9227-5bb3620903e2-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941639 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ccd9dca3-875d-4932-9227-5bb3620903e2-prometheus-k8s-db\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: 
I0318 16:48:44.941648 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-secret-kube-rbac-proxy\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:44.941829 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:44.941656 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ccd9dca3-875d-4932-9227-5bb3620903e2-web-config\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:48:45.436224 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436192 2570 generic.go:358] "Generic (PLEG): container finished" podID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerID="5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" exitCode=0 Mar 18 16:48:45.436224 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436217 2570 generic.go:358] "Generic (PLEG): container finished" podID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerID="32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" exitCode=0 Mar 18 16:48:45.436224 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436223 2570 generic.go:358] "Generic (PLEG): container finished" podID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerID="14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" exitCode=0 Mar 18 16:48:45.436224 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436229 2570 generic.go:358] "Generic (PLEG): container finished" podID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerID="443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" exitCode=0 Mar 18 16:48:45.436224 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436234 2570 generic.go:358] "Generic (PLEG): container finished" podID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerID="b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" exitCode=0 Mar 18 16:48:45.436948 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:48:45.436240 2570 generic.go:358] "Generic (PLEG): container finished" podID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerID="11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" exitCode=0 Mar 18 16:48:45.436948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436277 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerDied","Data":"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802"} Mar 18 16:48:45.436948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436328 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerDied","Data":"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971"} Mar 18 16:48:45.436948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436350 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerDied","Data":"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c"} Mar 18 16:48:45.436948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436292 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.436948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436370 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerDied","Data":"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19"} Mar 18 16:48:45.436948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436387 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerDied","Data":"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc"} Mar 18 16:48:45.436948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436396 2570 scope.go:117] "RemoveContainer" containerID="5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" Mar 18 16:48:45.436948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436402 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerDied","Data":"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086"} Mar 18 16:48:45.436948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.436493 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ccd9dca3-875d-4932-9227-5bb3620903e2","Type":"ContainerDied","Data":"8965939c3cad74c94917667f38166b86a55ade966b28d52b3e9b6c936b8c3e29"} Mar 18 16:48:45.446189 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.446169 2570 scope.go:117] "RemoveContainer" containerID="32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" Mar 18 16:48:45.455160 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.455133 2570 scope.go:117] "RemoveContainer" containerID="14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" Mar 18 
16:48:45.462550 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.462519 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:48:45.463814 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.463789 2570 scope.go:117] "RemoveContainer" containerID="443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" Mar 18 16:48:45.467211 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.467184 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:48:45.472668 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.472643 2570 scope.go:117] "RemoveContainer" containerID="b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" Mar 18 16:48:45.480943 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.480919 2570 scope.go:117] "RemoveContainer" containerID="11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" Mar 18 16:48:45.489991 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.489966 2570 scope.go:117] "RemoveContainer" containerID="db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80" Mar 18 16:48:45.495392 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495363 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:48:45.495743 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495729 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="init-config-reloader" Mar 18 16:48:45.495811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495745 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="init-config-reloader" Mar 18 16:48:45.495811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495759 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" 
containerName="kube-rbac-proxy-web" Mar 18 16:48:45.495811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495765 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy-web" Mar 18 16:48:45.495811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495774 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="config-reloader" Mar 18 16:48:45.495811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495779 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="config-reloader" Mar 18 16:48:45.495811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495787 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="prometheus" Mar 18 16:48:45.495811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495792 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="prometheus" Mar 18 16:48:45.495811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495802 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy" Mar 18 16:48:45.495811 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495809 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495817 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="thanos-sidecar" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495822 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="thanos-sidecar" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495832 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy-thanos" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495838 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy-thanos" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495890 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="config-reloader" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495896 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495902 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy-thanos" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495912 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="kube-rbac-proxy-web" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495922 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="thanos-sidecar" Mar 18 16:48:45.496318 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.495929 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" containerName="prometheus" Mar 18 16:48:45.499090 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.499072 2570 scope.go:117] "RemoveContainer" 
containerID="5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" Mar 18 16:48:45.499442 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:48:45.499421 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": container with ID starting with 5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802 not found: ID does not exist" containerID="5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" Mar 18 16:48:45.499509 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.499457 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802"} err="failed to get container status \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": rpc error: code = NotFound desc = could not find container \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": container with ID starting with 5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802 not found: ID does not exist" Mar 18 16:48:45.499591 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.499504 2570 scope.go:117] "RemoveContainer" containerID="32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" Mar 18 16:48:45.499827 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:48:45.499810 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": container with ID starting with 32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971 not found: ID does not exist" containerID="32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" Mar 18 16:48:45.499886 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.499833 2570 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971"} err="failed to get container status \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": rpc error: code = NotFound desc = could not find container \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": container with ID starting with 32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971 not found: ID does not exist" Mar 18 16:48:45.499886 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.499849 2570 scope.go:117] "RemoveContainer" containerID="14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" Mar 18 16:48:45.500062 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:48:45.500045 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": container with ID starting with 14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c not found: ID does not exist" containerID="14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" Mar 18 16:48:45.500104 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.500063 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c"} err="failed to get container status \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": rpc error: code = NotFound desc = could not find container \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": container with ID starting with 14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c not found: ID does not exist" Mar 18 16:48:45.500104 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.500078 2570 scope.go:117] "RemoveContainer" containerID="443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" Mar 18 16:48:45.500324 
ip-10-0-129-201 kubenswrapper[2570]: E0318 16:48:45.500307 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": container with ID starting with 443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19 not found: ID does not exist" containerID="443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" Mar 18 16:48:45.500361 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.500331 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19"} err="failed to get container status \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": rpc error: code = NotFound desc = could not find container \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": container with ID starting with 443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19 not found: ID does not exist" Mar 18 16:48:45.500361 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.500345 2570 scope.go:117] "RemoveContainer" containerID="b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" Mar 18 16:48:45.500600 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:48:45.500579 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": container with ID starting with b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc not found: ID does not exist" containerID="b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" Mar 18 16:48:45.500600 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.500598 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc"} err="failed to get container status \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": rpc error: code = NotFound desc = could not find container \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": container with ID starting with b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc not found: ID does not exist" Mar 18 16:48:45.500788 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.500610 2570 scope.go:117] "RemoveContainer" containerID="11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" Mar 18 16:48:45.500956 ip-10-0-129-201 kubenswrapper[2570]: E0318 16:48:45.500879 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": container with ID starting with 11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086 not found: ID does not exist" containerID="11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" Mar 18 16:48:45.500956 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.500904 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086"} err="failed to get container status \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": rpc error: code = NotFound desc = could not find container \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": container with ID starting with 11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086 not found: ID does not exist" Mar 18 16:48:45.500956 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.500928 2570 scope.go:117] "RemoveContainer" containerID="db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80" Mar 18 16:48:45.501195 ip-10-0-129-201 
kubenswrapper[2570]: E0318 16:48:45.501169 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": container with ID starting with db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80 not found: ID does not exist" containerID="db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80" Mar 18 16:48:45.501297 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.501201 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80"} err="failed to get container status \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": rpc error: code = NotFound desc = could not find container \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": container with ID starting with db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80 not found: ID does not exist" Mar 18 16:48:45.501297 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.501225 2570 scope.go:117] "RemoveContainer" containerID="5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" Mar 18 16:48:45.501436 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.501416 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.501860 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.501599 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802"} err="failed to get container status \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": rpc error: code = NotFound desc = could not find container \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": container with ID starting with 5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802 not found: ID does not exist" Mar 18 16:48:45.501860 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.501626 2570 scope.go:117] "RemoveContainer" containerID="32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" Mar 18 16:48:45.502095 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.501881 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971"} err="failed to get container status \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": rpc error: code = NotFound desc = could not find container \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": container with ID starting with 32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971 not found: ID does not exist" Mar 18 16:48:45.502095 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.501907 2570 scope.go:117] "RemoveContainer" containerID="14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" Mar 18 16:48:45.502205 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.502188 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c"} err="failed to get container status 
\"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": rpc error: code = NotFound desc = could not find container \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": container with ID starting with 14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c not found: ID does not exist" Mar 18 16:48:45.502244 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.502207 2570 scope.go:117] "RemoveContainer" containerID="443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" Mar 18 16:48:45.502478 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.502453 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19"} err="failed to get container status \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": rpc error: code = NotFound desc = could not find container \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": container with ID starting with 443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19 not found: ID does not exist" Mar 18 16:48:45.502533 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.502478 2570 scope.go:117] "RemoveContainer" containerID="b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" Mar 18 16:48:45.502734 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.502691 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc"} err="failed to get container status \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": rpc error: code = NotFound desc = could not find container \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": container with ID starting with b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc not found: ID does not exist" Mar 18 16:48:45.502775 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:48:45.502734 2570 scope.go:117] "RemoveContainer" containerID="11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" Mar 18 16:48:45.502962 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.502943 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086"} err="failed to get container status \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": rpc error: code = NotFound desc = could not find container \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": container with ID starting with 11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086 not found: ID does not exist" Mar 18 16:48:45.503009 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.502965 2570 scope.go:117] "RemoveContainer" containerID="db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80" Mar 18 16:48:45.503196 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.503177 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80"} err="failed to get container status \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": rpc error: code = NotFound desc = could not find container \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": container with ID starting with db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80 not found: ID does not exist" Mar 18 16:48:45.503259 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.503196 2570 scope.go:117] "RemoveContainer" containerID="5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" Mar 18 16:48:45.503426 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.503408 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802"} err="failed to get container status \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": rpc error: code = NotFound desc = could not find container \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": container with ID starting with 5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802 not found: ID does not exist" Mar 18 16:48:45.503467 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.503426 2570 scope.go:117] "RemoveContainer" containerID="32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" Mar 18 16:48:45.503657 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.503636 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971"} err="failed to get container status \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": rpc error: code = NotFound desc = could not find container \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": container with ID starting with 32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971 not found: ID does not exist" Mar 18 16:48:45.503720 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.503658 2570 scope.go:117] "RemoveContainer" containerID="14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" Mar 18 16:48:45.504289 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.504051 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Mar 18 16:48:45.504289 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.504131 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-v8xff\"" Mar 18 16:48:45.504289 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.504147 2570 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Mar 18 16:48:45.504464 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.504405 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Mar 18 16:48:45.504611 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.504593 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Mar 18 16:48:45.504837 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.504809 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9jiaq2go81fmn\"" Mar 18 16:48:45.505044 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.505020 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c"} err="failed to get container status \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": rpc error: code = NotFound desc = could not find container \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": container with ID starting with 14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c not found: ID does not exist" Mar 18 16:48:45.505122 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.505049 2570 scope.go:117] "RemoveContainer" containerID="443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" Mar 18 16:48:45.505364 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.505341 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19"} err="failed to get container status \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": rpc error: code = NotFound desc = 
could not find container \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": container with ID starting with 443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19 not found: ID does not exist" Mar 18 16:48:45.505440 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.505366 2570 scope.go:117] "RemoveContainer" containerID="b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" Mar 18 16:48:45.505578 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.505562 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Mar 18 16:48:45.505578 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.505573 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Mar 18 16:48:45.507109 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.505857 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc"} err="failed to get container status \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": rpc error: code = NotFound desc = could not find container \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": container with ID starting with b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc not found: ID does not exist" Mar 18 16:48:45.507109 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.505887 2570 scope.go:117] "RemoveContainer" containerID="11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" Mar 18 16:48:45.507109 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.505913 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Mar 18 16:48:45.507109 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.506411 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086"} err="failed to get container status \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": rpc error: code = NotFound desc = could not find container \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": container with ID starting with 11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086 not found: ID does not exist" Mar 18 16:48:45.507109 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.506435 2570 scope.go:117] "RemoveContainer" containerID="db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80" Mar 18 16:48:45.507777 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.507590 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Mar 18 16:48:45.508315 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.508056 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Mar 18 16:48:45.508315 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.508258 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Mar 18 16:48:45.508315 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.508248 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80"} err="failed to get container status \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": rpc error: code = NotFound desc = could not find container \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": container with ID starting with db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80 not found: ID does not exist" Mar 18 16:48:45.508315 
ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.508284 2570 scope.go:117] "RemoveContainer" containerID="5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" Mar 18 16:48:45.508923 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.508877 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802"} err="failed to get container status \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": rpc error: code = NotFound desc = could not find container \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": container with ID starting with 5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802 not found: ID does not exist" Mar 18 16:48:45.509009 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.508925 2570 scope.go:117] "RemoveContainer" containerID="32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" Mar 18 16:48:45.509124 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.509102 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Mar 18 16:48:45.509326 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.509304 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971"} err="failed to get container status \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": rpc error: code = NotFound desc = could not find container \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": container with ID starting with 32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971 not found: ID does not exist" Mar 18 16:48:45.509399 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.509327 2570 scope.go:117] "RemoveContainer" 
containerID="14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" Mar 18 16:48:45.509621 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.509600 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c"} err="failed to get container status \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": rpc error: code = NotFound desc = could not find container \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": container with ID starting with 14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c not found: ID does not exist" Mar 18 16:48:45.509678 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.509622 2570 scope.go:117] "RemoveContainer" containerID="443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" Mar 18 16:48:45.509910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.509887 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19"} err="failed to get container status \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": rpc error: code = NotFound desc = could not find container \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": container with ID starting with 443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19 not found: ID does not exist" Mar 18 16:48:45.509910 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.509908 2570 scope.go:117] "RemoveContainer" containerID="b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" Mar 18 16:48:45.510276 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.510186 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc"} err="failed to get container status 
\"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": rpc error: code = NotFound desc = could not find container \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": container with ID starting with b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc not found: ID does not exist" Mar 18 16:48:45.510276 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.510211 2570 scope.go:117] "RemoveContainer" containerID="11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" Mar 18 16:48:45.510553 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.510507 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086"} err="failed to get container status \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": rpc error: code = NotFound desc = could not find container \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": container with ID starting with 11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086 not found: ID does not exist" Mar 18 16:48:45.510553 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.510532 2570 scope.go:117] "RemoveContainer" containerID="db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80" Mar 18 16:48:45.512187 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.510817 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80"} err="failed to get container status \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": rpc error: code = NotFound desc = could not find container \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": container with ID starting with db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80 not found: ID does not exist" Mar 18 16:48:45.512187 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:48:45.510838 2570 scope.go:117] "RemoveContainer" containerID="5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" Mar 18 16:48:45.512187 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.511120 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802"} err="failed to get container status \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": rpc error: code = NotFound desc = could not find container \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": container with ID starting with 5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802 not found: ID does not exist" Mar 18 16:48:45.512187 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.511149 2570 scope.go:117] "RemoveContainer" containerID="32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" Mar 18 16:48:45.513672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.512380 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Mar 18 16:48:45.513672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.512674 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971"} err="failed to get container status \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": rpc error: code = NotFound desc = could not find container \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": container with ID starting with 32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971 not found: ID does not exist" Mar 18 16:48:45.513672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.512720 2570 scope.go:117] "RemoveContainer" 
containerID="14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" Mar 18 16:48:45.513672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.512999 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c"} err="failed to get container status \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": rpc error: code = NotFound desc = could not find container \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": container with ID starting with 14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c not found: ID does not exist" Mar 18 16:48:45.513672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.513023 2570 scope.go:117] "RemoveContainer" containerID="443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" Mar 18 16:48:45.513672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.513306 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19"} err="failed to get container status \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": rpc error: code = NotFound desc = could not find container \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": container with ID starting with 443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19 not found: ID does not exist" Mar 18 16:48:45.513672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.513323 2570 scope.go:117] "RemoveContainer" containerID="b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" Mar 18 16:48:45.513672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.513566 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc"} err="failed to get container status 
\"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": rpc error: code = NotFound desc = could not find container \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": container with ID starting with b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc not found: ID does not exist" Mar 18 16:48:45.513672 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.513584 2570 scope.go:117] "RemoveContainer" containerID="11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" Mar 18 16:48:45.514097 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.513797 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086"} err="failed to get container status \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": rpc error: code = NotFound desc = could not find container \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": container with ID starting with 11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086 not found: ID does not exist" Mar 18 16:48:45.514097 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.513810 2570 scope.go:117] "RemoveContainer" containerID="db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80" Mar 18 16:48:45.514097 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.513978 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80"} err="failed to get container status \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": rpc error: code = NotFound desc = could not find container \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": container with ID starting with db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80 not found: ID does not exist" Mar 18 16:48:45.514097 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:48:45.513992 2570 scope.go:117] "RemoveContainer" containerID="5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802" Mar 18 16:48:45.514277 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.514159 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802"} err="failed to get container status \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": rpc error: code = NotFound desc = could not find container \"5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802\": container with ID starting with 5721bfc8f3b74d337556a97e87ce9b7d1b7d338f2f99850edf1646ff3d61c802 not found: ID does not exist" Mar 18 16:48:45.514277 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.514174 2570 scope.go:117] "RemoveContainer" containerID="32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971" Mar 18 16:48:45.514372 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.514330 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:48:45.514474 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.514448 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971"} err="failed to get container status \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": rpc error: code = NotFound desc = could not find container \"32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971\": container with ID starting with 32feea89ad9281e698f86c24830367879f3b7be1b7f7fbb9d833aaa40da47971 not found: ID does not exist" Mar 18 16:48:45.514585 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.514475 2570 scope.go:117] "RemoveContainer" containerID="14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c" Mar 18 16:48:45.514801 ip-10-0-129-201 
kubenswrapper[2570]: I0318 16:48:45.514777 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c"} err="failed to get container status \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": rpc error: code = NotFound desc = could not find container \"14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c\": container with ID starting with 14a4284adba7c983dff2e2066b6bead389b3d7720515841b2c3508d69917a18c not found: ID does not exist" Mar 18 16:48:45.514870 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.514805 2570 scope.go:117] "RemoveContainer" containerID="443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19" Mar 18 16:48:45.515079 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.515056 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19"} err="failed to get container status \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": rpc error: code = NotFound desc = could not find container \"443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19\": container with ID starting with 443e98cec9260a59a0c0b7c5551a5b40fdce6dac86e7090275c63c5f7691be19 not found: ID does not exist" Mar 18 16:48:45.515180 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.515078 2570 scope.go:117] "RemoveContainer" containerID="b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc" Mar 18 16:48:45.515308 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.515292 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc"} err="failed to get container status \"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": rpc error: code = NotFound desc = could not find container 
\"b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc\": container with ID starting with b5ff260b6f459a6dbc97bcf9761ff620c7d3c1cae5441ba82d83ca75397e5dbc not found: ID does not exist" Mar 18 16:48:45.515366 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.515311 2570 scope.go:117] "RemoveContainer" containerID="11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086" Mar 18 16:48:45.515546 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.515522 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086"} err="failed to get container status \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": rpc error: code = NotFound desc = could not find container \"11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086\": container with ID starting with 11698c0350510f678d2d54a81d57574eb8635bb73203fd438562d3af6f6cd086 not found: ID does not exist" Mar 18 16:48:45.515617 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.515551 2570 scope.go:117] "RemoveContainer" containerID="db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80" Mar 18 16:48:45.515849 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.515823 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80"} err="failed to get container status \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": rpc error: code = NotFound desc = could not find container \"db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80\": container with ID starting with db8b761a978c18f160c9ee177acdbc1c9d3228733b300392130886738f758a80 not found: ID does not exist" Mar 18 16:48:45.550995 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.550937 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.550995 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.550988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551258 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551258 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551245 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551345 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551345 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551306 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wwjg\" (UniqueName: \"kubernetes.io/projected/207ae0e1-58da-43d5-a7e4-475bb668e042-kube-api-access-4wwjg\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551345 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551333 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-config\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551531 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551512 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551592 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551567 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/207ae0e1-58da-43d5-a7e4-475bb668e042-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551731 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551679 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551812 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551790 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/207ae0e1-58da-43d5-a7e4-475bb668e042-config-out\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551880 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-web-config\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551930 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551901 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.551978 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.551963 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.552048 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:48:45.552033 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.552107 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.552089 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.552159 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.552143 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/207ae0e1-58da-43d5-a7e4-475bb668e042-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.552223 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.552208 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.652948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.652905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-prometheus-k8s-tls\") pod 
\"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.652948 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.652945 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653151 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.652966 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653151 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.652994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653151 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653290 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653144 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wwjg\" 
(UniqueName: \"kubernetes.io/projected/207ae0e1-58da-43d5-a7e4-475bb668e042-kube-api-access-4wwjg\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653290 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-config\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653290 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653238 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653290 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/207ae0e1-58da-43d5-a7e4-475bb668e042-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653479 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653305 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653479 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/207ae0e1-58da-43d5-a7e4-475bb668e042-config-out\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653479 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-web-config\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653479 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653400 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653479 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653743 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653509 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653743 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653543 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653743 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653575 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/207ae0e1-58da-43d5-a7e4-475bb668e042-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653743 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653635 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.653968 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.653825 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.655270 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.654113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.655270 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.654928 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.656578 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.656543 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.656724 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.656631 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.657028 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.657005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.657103 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.657034 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.657103 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.657087 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.657492 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.657417 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.658301 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.657693 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/207ae0e1-58da-43d5-a7e4-475bb668e042-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.658301 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.657768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.658301 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.657913 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.658301 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.658258 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.659631 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.659610 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/207ae0e1-58da-43d5-a7e4-475bb668e042-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.659949 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.659923 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/207ae0e1-58da-43d5-a7e4-475bb668e042-config\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.660044 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.659994 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/207ae0e1-58da-43d5-a7e4-475bb668e042-config-out\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.660991 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.660967 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/207ae0e1-58da-43d5-a7e4-475bb668e042-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.662513 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.662492 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wwjg\" (UniqueName: \"kubernetes.io/projected/207ae0e1-58da-43d5-a7e4-475bb668e042-kube-api-access-4wwjg\") pod \"prometheus-k8s-0\" (UID: \"207ae0e1-58da-43d5-a7e4-475bb668e042\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.815729 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.815660 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:45.960222 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:45.960094 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:48:45.962960 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:48:45.962932 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod207ae0e1_58da_43d5_a7e4_475bb668e042.slice/crio-ab3c80a3091551d4d8b2c561a2b12818a731f40076067a44d3d911ab7b4b15fb WatchSource:0}: Error finding container ab3c80a3091551d4d8b2c561a2b12818a731f40076067a44d3d911ab7b4b15fb: Status 404 returned error can't find the container with id ab3c80a3091551d4d8b2c561a2b12818a731f40076067a44d3d911ab7b4b15fb Mar 18 16:48:46.442003 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:46.441970 2570 generic.go:358] "Generic (PLEG): container finished" podID="207ae0e1-58da-43d5-a7e4-475bb668e042" containerID="0cf31c5e8067216261974361027d957aa16974e77f0435bc0a8401855436f6bc" exitCode=0 Mar 18 16:48:46.442552 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:46.442027 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"207ae0e1-58da-43d5-a7e4-475bb668e042","Type":"ContainerDied","Data":"0cf31c5e8067216261974361027d957aa16974e77f0435bc0a8401855436f6bc"} Mar 18 16:48:46.442552 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:46.442045 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"207ae0e1-58da-43d5-a7e4-475bb668e042","Type":"ContainerStarted","Data":"ab3c80a3091551d4d8b2c561a2b12818a731f40076067a44d3d911ab7b4b15fb"} Mar 18 16:48:46.520636 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:46.520595 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd9dca3-875d-4932-9227-5bb3620903e2" path="/var/lib/kubelet/pods/ccd9dca3-875d-4932-9227-5bb3620903e2/volumes" Mar 18 16:48:47.448897 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:47.448863 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"207ae0e1-58da-43d5-a7e4-475bb668e042","Type":"ContainerStarted","Data":"23687f5a68f6f86d16ad3492bc93d6c62e9a9def62bbfd69651f93b88438877e"} Mar 18 16:48:47.448897 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:47.448903 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"207ae0e1-58da-43d5-a7e4-475bb668e042","Type":"ContainerStarted","Data":"5a904d9f0484bd0067e1bb940411dd67ab211df7728c496e654569b3b8fcd5db"} Mar 18 16:48:47.449332 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:47.448915 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"207ae0e1-58da-43d5-a7e4-475bb668e042","Type":"ContainerStarted","Data":"d76e7bf5c64a1aae68242eccdbc0482376b3571572f54e3a7f0eebb1a2cc7cb0"} Mar 18 16:48:47.449332 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:47.448925 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"207ae0e1-58da-43d5-a7e4-475bb668e042","Type":"ContainerStarted","Data":"86df93d0069987613d2fe7b8062001295825ab6b48b095bf91cd741ede28cd63"} Mar 18 16:48:47.449332 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:47.448934 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"207ae0e1-58da-43d5-a7e4-475bb668e042","Type":"ContainerStarted","Data":"d68b8ad79e1dbc6229ef23cd91805c675cb5bd9029e5107b31570ec0de1c7551"} Mar 18 16:48:47.449332 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:47.448941 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"207ae0e1-58da-43d5-a7e4-475bb668e042","Type":"ContainerStarted","Data":"6cfa511f9bf6ef0ee662d478971d31f1e895ae786c76b1c6833e6b34cb9d7a27"} Mar 18 16:48:47.485723 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:47.484877 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.48483084 podStartE2EDuration="2.48483084s" podCreationTimestamp="2026-03-18 16:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:48:47.481757362 +0000 UTC m=+251.587408725" watchObservedRunningTime="2026-03-18 16:48:47.48483084 +0000 UTC m=+251.590482240" Mar 18 16:48:48.376629 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:48.376572 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:48:48.379246 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:48.379204 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/38dc6da4-4394-4935-80a5-6a872bf72125-metrics-certs\") pod \"network-metrics-daemon-ktpw5\" (UID: \"38dc6da4-4394-4935-80a5-6a872bf72125\") " pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:48:48.618484 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:48.618451 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-97tcn\"" Mar 18 16:48:48.625714 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:48.625672 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktpw5" Mar 18 16:48:48.756723 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:48.756673 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ktpw5"] Mar 18 16:48:48.760051 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:48:48.760019 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38dc6da4_4394_4935_80a5_6a872bf72125.slice/crio-38a1c57428f7280f5f97b8b170b0f6b6d237646e24a35e9a80263e888388eb11 WatchSource:0}: Error finding container 38a1c57428f7280f5f97b8b170b0f6b6d237646e24a35e9a80263e888388eb11: Status 404 returned error can't find the container with id 38a1c57428f7280f5f97b8b170b0f6b6d237646e24a35e9a80263e888388eb11 Mar 18 16:48:49.457552 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:49.457506 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktpw5" event={"ID":"38dc6da4-4394-4935-80a5-6a872bf72125","Type":"ContainerStarted","Data":"38a1c57428f7280f5f97b8b170b0f6b6d237646e24a35e9a80263e888388eb11"} Mar 18 16:48:50.463147 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:50.463108 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktpw5" 
event={"ID":"38dc6da4-4394-4935-80a5-6a872bf72125","Type":"ContainerStarted","Data":"5094873124f53195fe9472be0e8ca5a5b67742a576d6b184b2e1a5e128612de2"} Mar 18 16:48:50.463147 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:50.463149 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktpw5" event={"ID":"38dc6da4-4394-4935-80a5-6a872bf72125","Type":"ContainerStarted","Data":"6fd7bcae0e05722802407cc3031fa7566b500dfce1208a56ddc8138261cfebc1"} Mar 18 16:48:50.482130 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:50.482069 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ktpw5" podStartSLOduration=253.441348701 podStartE2EDuration="4m14.482046367s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:48:48.76198966 +0000 UTC m=+252.867641000" lastFinishedPulling="2026-03-18 16:48:49.802687316 +0000 UTC m=+253.908338666" observedRunningTime="2026-03-18 16:48:50.479639951 +0000 UTC m=+254.585291312" watchObservedRunningTime="2026-03-18 16:48:50.482046367 +0000 UTC m=+254.587697729" Mar 18 16:48:50.816387 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:48:50.816350 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:36.391285 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:49:36.391254 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:49:36.391852 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:49:36.391569 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:49:36.404580 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:49:36.404552 2570 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 16:49:36.405097 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:49:36.405059 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 16:49:36.410578 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:49:36.410534 2570 kubelet.go:1628] "Image garbage collection succeeded" Mar 18 16:49:45.816127 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:49:45.816083 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:45.833378 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:49:45.833343 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:46.669151 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:49:46.669120 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:51:57.393943 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.393855 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-kjrd6"] Mar 18 16:51:57.397108 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.397090 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kjrd6" Mar 18 16:51:57.399588 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.399562 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qwszx\"" Mar 18 16:51:57.399780 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.399634 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Mar 18 16:51:57.399780 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.399637 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:51:57.400649 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.400634 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:51:57.404641 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.404615 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kjrd6"] Mar 18 16:51:57.510681 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.510641 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpmlr\" (UniqueName: \"kubernetes.io/projected/cafebdbc-01e1-44c1-b600-699b4f85fd05-kube-api-access-xpmlr\") pod \"s3-init-kjrd6\" (UID: \"cafebdbc-01e1-44c1-b600-699b4f85fd05\") " pod="kserve/s3-init-kjrd6" Mar 18 16:51:57.611395 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.611351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpmlr\" (UniqueName: \"kubernetes.io/projected/cafebdbc-01e1-44c1-b600-699b4f85fd05-kube-api-access-xpmlr\") pod \"s3-init-kjrd6\" (UID: \"cafebdbc-01e1-44c1-b600-699b4f85fd05\") " pod="kserve/s3-init-kjrd6" Mar 18 16:51:57.619395 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.619359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpmlr\" 
(UniqueName: \"kubernetes.io/projected/cafebdbc-01e1-44c1-b600-699b4f85fd05-kube-api-access-xpmlr\") pod \"s3-init-kjrd6\" (UID: \"cafebdbc-01e1-44c1-b600-699b4f85fd05\") " pod="kserve/s3-init-kjrd6" Mar 18 16:51:57.719598 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.719504 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-kjrd6" Mar 18 16:51:57.852691 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.852653 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kjrd6"] Mar 18 16:51:57.856037 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:51:57.856005 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcafebdbc_01e1_44c1_b600_699b4f85fd05.slice/crio-8d4040608b87ef7721be4bcc6dfa1c99c64312830181f19bcdc7a0f088e48e1b WatchSource:0}: Error finding container 8d4040608b87ef7721be4bcc6dfa1c99c64312830181f19bcdc7a0f088e48e1b: Status 404 returned error can't find the container with id 8d4040608b87ef7721be4bcc6dfa1c99c64312830181f19bcdc7a0f088e48e1b Mar 18 16:51:57.857831 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:57.857808 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:51:58.089615 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:51:58.089570 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kjrd6" event={"ID":"cafebdbc-01e1-44c1-b600-699b4f85fd05","Type":"ContainerStarted","Data":"8d4040608b87ef7721be4bcc6dfa1c99c64312830181f19bcdc7a0f088e48e1b"} Mar 18 16:52:03.110849 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:03.110803 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kjrd6" event={"ID":"cafebdbc-01e1-44c1-b600-699b4f85fd05","Type":"ContainerStarted","Data":"f14b7ee2487f6f02610969bca4d6676ecca3713f1e894e5398b3f9921ad04b6a"} Mar 18 16:52:03.129607 ip-10-0-129-201 kubenswrapper[2570]: I0318 
16:52:03.129539 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-kjrd6" podStartSLOduration=1.53965226 podStartE2EDuration="6.129518417s" podCreationTimestamp="2026-03-18 16:51:57 +0000 UTC" firstStartedPulling="2026-03-18 16:51:57.857939491 +0000 UTC m=+441.963590832" lastFinishedPulling="2026-03-18 16:52:02.447805649 +0000 UTC m=+446.553456989" observedRunningTime="2026-03-18 16:52:03.126759919 +0000 UTC m=+447.232411281" watchObservedRunningTime="2026-03-18 16:52:03.129518417 +0000 UTC m=+447.235169780" Mar 18 16:52:06.122517 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:06.122429 2570 generic.go:358] "Generic (PLEG): container finished" podID="cafebdbc-01e1-44c1-b600-699b4f85fd05" containerID="f14b7ee2487f6f02610969bca4d6676ecca3713f1e894e5398b3f9921ad04b6a" exitCode=0 Mar 18 16:52:06.122908 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:06.122508 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kjrd6" event={"ID":"cafebdbc-01e1-44c1-b600-699b4f85fd05","Type":"ContainerDied","Data":"f14b7ee2487f6f02610969bca4d6676ecca3713f1e894e5398b3f9921ad04b6a"} Mar 18 16:52:07.277861 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:07.277830 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kjrd6" Mar 18 16:52:07.411765 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:07.411652 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpmlr\" (UniqueName: \"kubernetes.io/projected/cafebdbc-01e1-44c1-b600-699b4f85fd05-kube-api-access-xpmlr\") pod \"cafebdbc-01e1-44c1-b600-699b4f85fd05\" (UID: \"cafebdbc-01e1-44c1-b600-699b4f85fd05\") " Mar 18 16:52:07.414085 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:07.414047 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cafebdbc-01e1-44c1-b600-699b4f85fd05-kube-api-access-xpmlr" (OuterVolumeSpecName: "kube-api-access-xpmlr") pod "cafebdbc-01e1-44c1-b600-699b4f85fd05" (UID: "cafebdbc-01e1-44c1-b600-699b4f85fd05"). InnerVolumeSpecName "kube-api-access-xpmlr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:52:07.513181 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:07.513140 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpmlr\" (UniqueName: \"kubernetes.io/projected/cafebdbc-01e1-44c1-b600-699b4f85fd05-kube-api-access-xpmlr\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:52:08.130436 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:08.130403 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kjrd6" Mar 18 16:52:08.130436 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:08.130430 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kjrd6" event={"ID":"cafebdbc-01e1-44c1-b600-699b4f85fd05","Type":"ContainerDied","Data":"8d4040608b87ef7721be4bcc6dfa1c99c64312830181f19bcdc7a0f088e48e1b"} Mar 18 16:52:08.130724 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:08.130459 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d4040608b87ef7721be4bcc6dfa1c99c64312830181f19bcdc7a0f088e48e1b" Mar 18 16:52:18.017837 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.017798 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-64jx8"] Mar 18 16:52:18.018209 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.018182 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cafebdbc-01e1-44c1-b600-699b4f85fd05" containerName="s3-init" Mar 18 16:52:18.018209 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.018194 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafebdbc-01e1-44c1-b600-699b4f85fd05" containerName="s3-init" Mar 18 16:52:18.018277 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.018252 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="cafebdbc-01e1-44c1-b600-699b4f85fd05" containerName="s3-init" Mar 18 16:52:18.021386 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.021356 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-64jx8" Mar 18 16:52:18.023934 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.023907 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:52:18.024090 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.023954 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Mar 18 16:52:18.024090 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.023907 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:52:18.024869 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.024854 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qwszx\"" Mar 18 16:52:18.028567 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.028541 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-64jx8"] Mar 18 16:52:18.110748 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.110678 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm5dz\" (UniqueName: \"kubernetes.io/projected/494f89b2-5b12-473f-929d-4fb2fd6b6233-kube-api-access-rm5dz\") pod \"s3-tls-init-custom-64jx8\" (UID: \"494f89b2-5b12-473f-929d-4fb2fd6b6233\") " pod="kserve/s3-tls-init-custom-64jx8" Mar 18 16:52:18.212090 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.212046 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm5dz\" (UniqueName: \"kubernetes.io/projected/494f89b2-5b12-473f-929d-4fb2fd6b6233-kube-api-access-rm5dz\") pod \"s3-tls-init-custom-64jx8\" (UID: \"494f89b2-5b12-473f-929d-4fb2fd6b6233\") " pod="kserve/s3-tls-init-custom-64jx8" Mar 18 16:52:18.221494 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.221452 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm5dz\" (UniqueName: \"kubernetes.io/projected/494f89b2-5b12-473f-929d-4fb2fd6b6233-kube-api-access-rm5dz\") pod \"s3-tls-init-custom-64jx8\" (UID: \"494f89b2-5b12-473f-929d-4fb2fd6b6233\") " pod="kserve/s3-tls-init-custom-64jx8" Mar 18 16:52:18.348098 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.348052 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-64jx8" Mar 18 16:52:18.485366 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:18.485322 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-64jx8"] Mar 18 16:52:18.488967 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:52:18.488935 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod494f89b2_5b12_473f_929d_4fb2fd6b6233.slice/crio-e0f1cb8788608909ee820ec189af7786781bafb67bf0e01f486453f3b2f101d5 WatchSource:0}: Error finding container e0f1cb8788608909ee820ec189af7786781bafb67bf0e01f486453f3b2f101d5: Status 404 returned error can't find the container with id e0f1cb8788608909ee820ec189af7786781bafb67bf0e01f486453f3b2f101d5 Mar 18 16:52:19.168343 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:19.168304 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-64jx8" event={"ID":"494f89b2-5b12-473f-929d-4fb2fd6b6233","Type":"ContainerStarted","Data":"98e8898fbe24be38ac5f2ce82001c7c23558ba30c7b185d1e83a8e0419b35d72"} Mar 18 16:52:19.168343 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:19.168342 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-64jx8" event={"ID":"494f89b2-5b12-473f-929d-4fb2fd6b6233","Type":"ContainerStarted","Data":"e0f1cb8788608909ee820ec189af7786781bafb67bf0e01f486453f3b2f101d5"} Mar 18 16:52:19.185719 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:19.185643 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-64jx8" podStartSLOduration=1.185626637 podStartE2EDuration="1.185626637s" podCreationTimestamp="2026-03-18 16:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:52:19.18385284 +0000 UTC m=+463.289504202" watchObservedRunningTime="2026-03-18 16:52:19.185626637 +0000 UTC m=+463.291277999" Mar 18 16:52:24.186665 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:24.186568 2570 generic.go:358] "Generic (PLEG): container finished" podID="494f89b2-5b12-473f-929d-4fb2fd6b6233" containerID="98e8898fbe24be38ac5f2ce82001c7c23558ba30c7b185d1e83a8e0419b35d72" exitCode=0 Mar 18 16:52:24.186665 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:24.186644 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-64jx8" event={"ID":"494f89b2-5b12-473f-929d-4fb2fd6b6233","Type":"ContainerDied","Data":"98e8898fbe24be38ac5f2ce82001c7c23558ba30c7b185d1e83a8e0419b35d72"} Mar 18 16:52:25.324749 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:25.324717 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-64jx8" Mar 18 16:52:25.379431 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:25.379398 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm5dz\" (UniqueName: \"kubernetes.io/projected/494f89b2-5b12-473f-929d-4fb2fd6b6233-kube-api-access-rm5dz\") pod \"494f89b2-5b12-473f-929d-4fb2fd6b6233\" (UID: \"494f89b2-5b12-473f-929d-4fb2fd6b6233\") " Mar 18 16:52:25.381863 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:25.381826 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/494f89b2-5b12-473f-929d-4fb2fd6b6233-kube-api-access-rm5dz" (OuterVolumeSpecName: "kube-api-access-rm5dz") pod "494f89b2-5b12-473f-929d-4fb2fd6b6233" (UID: "494f89b2-5b12-473f-929d-4fb2fd6b6233"). InnerVolumeSpecName "kube-api-access-rm5dz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:52:25.480487 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:25.480386 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rm5dz\" (UniqueName: \"kubernetes.io/projected/494f89b2-5b12-473f-929d-4fb2fd6b6233-kube-api-access-rm5dz\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:52:26.193913 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:26.193882 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-64jx8" Mar 18 16:52:26.194101 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:26.193851 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-64jx8" event={"ID":"494f89b2-5b12-473f-929d-4fb2fd6b6233","Type":"ContainerDied","Data":"e0f1cb8788608909ee820ec189af7786781bafb67bf0e01f486453f3b2f101d5"} Mar 18 16:52:26.194101 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:26.193990 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f1cb8788608909ee820ec189af7786781bafb67bf0e01f486453f3b2f101d5" Mar 18 16:52:28.279694 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.279655 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-8m6mn"] Mar 18 16:52:28.280137 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.280047 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="494f89b2-5b12-473f-929d-4fb2fd6b6233" containerName="s3-tls-init-custom" Mar 18 16:52:28.280137 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.280059 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="494f89b2-5b12-473f-929d-4fb2fd6b6233" containerName="s3-tls-init-custom" Mar 18 16:52:28.280137 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.280136 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="494f89b2-5b12-473f-929d-4fb2fd6b6233" containerName="s3-tls-init-custom" Mar 18 16:52:28.282234 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.282216 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-8m6mn" Mar 18 16:52:28.284951 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.284922 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Mar 18 16:52:28.285091 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.285002 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qwszx\"" Mar 18 16:52:28.285176 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.285162 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:52:28.285251 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.285234 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:52:28.290946 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.290921 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-8m6mn"] Mar 18 16:52:28.304850 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.304810 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr5n5\" (UniqueName: \"kubernetes.io/projected/82a7c18b-a3f6-44bc-8673-44ce516c61b2-kube-api-access-qr5n5\") pod \"s3-tls-init-serving-8m6mn\" (UID: \"82a7c18b-a3f6-44bc-8673-44ce516c61b2\") " pod="kserve/s3-tls-init-serving-8m6mn" Mar 18 16:52:28.406144 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.406103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qr5n5\" (UniqueName: \"kubernetes.io/projected/82a7c18b-a3f6-44bc-8673-44ce516c61b2-kube-api-access-qr5n5\") pod \"s3-tls-init-serving-8m6mn\" (UID: \"82a7c18b-a3f6-44bc-8673-44ce516c61b2\") " pod="kserve/s3-tls-init-serving-8m6mn" Mar 18 16:52:28.421399 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.421359 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr5n5\" (UniqueName: \"kubernetes.io/projected/82a7c18b-a3f6-44bc-8673-44ce516c61b2-kube-api-access-qr5n5\") pod \"s3-tls-init-serving-8m6mn\" (UID: \"82a7c18b-a3f6-44bc-8673-44ce516c61b2\") " pod="kserve/s3-tls-init-serving-8m6mn" Mar 18 16:52:28.605079 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.605040 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-8m6mn" Mar 18 16:52:28.741421 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:28.741346 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-8m6mn"] Mar 18 16:52:28.744418 ip-10-0-129-201 kubenswrapper[2570]: W0318 16:52:28.744387 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82a7c18b_a3f6_44bc_8673_44ce516c61b2.slice/crio-f89b958c8887f33ceb435944643a895986f3873bbe908859a7e1acab1fe6a2ea WatchSource:0}: Error finding container f89b958c8887f33ceb435944643a895986f3873bbe908859a7e1acab1fe6a2ea: Status 404 returned error can't find the container with id f89b958c8887f33ceb435944643a895986f3873bbe908859a7e1acab1fe6a2ea Mar 18 16:52:29.205977 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:29.205942 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-8m6mn" event={"ID":"82a7c18b-a3f6-44bc-8673-44ce516c61b2","Type":"ContainerStarted","Data":"dac368ea9525d8425145deee7ba0a27fb157383f24d999d1bda502b6a4c0957f"} Mar 18 16:52:29.205977 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:29.205981 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-8m6mn" event={"ID":"82a7c18b-a3f6-44bc-8673-44ce516c61b2","Type":"ContainerStarted","Data":"f89b958c8887f33ceb435944643a895986f3873bbe908859a7e1acab1fe6a2ea"} Mar 18 16:52:33.223034 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:33.222932 2570 
generic.go:358] "Generic (PLEG): container finished" podID="82a7c18b-a3f6-44bc-8673-44ce516c61b2" containerID="dac368ea9525d8425145deee7ba0a27fb157383f24d999d1bda502b6a4c0957f" exitCode=0 Mar 18 16:52:33.223034 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:33.223015 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-8m6mn" event={"ID":"82a7c18b-a3f6-44bc-8673-44ce516c61b2","Type":"ContainerDied","Data":"dac368ea9525d8425145deee7ba0a27fb157383f24d999d1bda502b6a4c0957f"} Mar 18 16:52:34.362374 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:34.362345 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-8m6mn" Mar 18 16:52:34.465479 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:34.465436 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr5n5\" (UniqueName: \"kubernetes.io/projected/82a7c18b-a3f6-44bc-8673-44ce516c61b2-kube-api-access-qr5n5\") pod \"82a7c18b-a3f6-44bc-8673-44ce516c61b2\" (UID: \"82a7c18b-a3f6-44bc-8673-44ce516c61b2\") " Mar 18 16:52:34.467980 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:34.467953 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a7c18b-a3f6-44bc-8673-44ce516c61b2-kube-api-access-qr5n5" (OuterVolumeSpecName: "kube-api-access-qr5n5") pod "82a7c18b-a3f6-44bc-8673-44ce516c61b2" (UID: "82a7c18b-a3f6-44bc-8673-44ce516c61b2"). InnerVolumeSpecName "kube-api-access-qr5n5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:52:34.567129 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:34.567091 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qr5n5\" (UniqueName: \"kubernetes.io/projected/82a7c18b-a3f6-44bc-8673-44ce516c61b2-kube-api-access-qr5n5\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 16:52:35.231499 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:35.231464 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-8m6mn" event={"ID":"82a7c18b-a3f6-44bc-8673-44ce516c61b2","Type":"ContainerDied","Data":"f89b958c8887f33ceb435944643a895986f3873bbe908859a7e1acab1fe6a2ea"} Mar 18 16:52:35.231499 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:35.231501 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f89b958c8887f33ceb435944643a895986f3873bbe908859a7e1acab1fe6a2ea" Mar 18 16:52:35.231723 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:52:35.231508 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-8m6mn" Mar 18 16:54:36.429884 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:54:36.429854 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:54:36.431609 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:54:36.431580 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:54:36.434842 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:54:36.434814 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 16:54:36.436506 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:54:36.436484 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 16:59:36.459451 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:59:36.459361 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:59:36.462749 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:59:36.462723 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 16:59:36.464712 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:59:36.464669 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 16:59:36.467781 ip-10-0-129-201 kubenswrapper[2570]: I0318 16:59:36.467752 2570 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:04:36.486556 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:04:36.486525 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:04:36.491091 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:04:36.491065 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:04:36.491449 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:04:36.491432 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:04:36.496497 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:04:36.496475 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:09:36.513760 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:09:36.513733 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:09:36.518752 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:09:36.518724 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:09:36.520264 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:09:36.520240 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:09:36.525203 
ip-10-0-129-201 kubenswrapper[2570]: I0318 17:09:36.525183 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:14:36.542724 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:14:36.542667 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:14:36.548351 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:14:36.548307 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:14:36.548782 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:14:36.548740 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:14:36.559876 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:14:36.559850 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:19:36.580031 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:19:36.580000 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:19:36.585119 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:19:36.585087 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:19:36.585325 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:19:36.585259 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:19:36.590305 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:19:36.590281 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:24:36.608014 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:24:36.607981 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:24:36.613403 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:24:36.613370 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:24:36.613582 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:24:36.613537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:24:36.618453 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:24:36.618428 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:29:36.635633 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:29:36.635546 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:29:36.640632 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:29:36.640609 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:29:36.641639 ip-10-0-129-201 
kubenswrapper[2570]: I0318 17:29:36.641616 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:29:36.646362 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:29:36.646340 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:34:36.667588 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:34:36.667553 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:34:36.673682 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:34:36.673650 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:34:36.676437 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:34:36.676410 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:34:36.681434 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:34:36.681406 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:39:36.699220 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:39:36.699191 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:39:36.703721 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:39:36.703672 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:39:36.705173 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:39:36.705153 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:39:36.709551 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:39:36.709532 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:44:36.728133 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:44:36.728098 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:44:36.733290 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:44:36.733265 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:44:36.734199 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:44:36.734173 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:44:36.738932 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:44:36.738912 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log" Mar 18 17:46:10.528013 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.527978 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hjr4p/must-gather-9q8dv"] Mar 18 17:46:10.528500 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.528334 2570 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="82a7c18b-a3f6-44bc-8673-44ce516c61b2" containerName="s3-tls-init-serving" Mar 18 17:46:10.528500 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.528345 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a7c18b-a3f6-44bc-8673-44ce516c61b2" containerName="s3-tls-init-serving" Mar 18 17:46:10.528500 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.528415 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="82a7c18b-a3f6-44bc-8673-44ce516c61b2" containerName="s3-tls-init-serving" Mar 18 17:46:10.531606 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.531584 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:10.537883 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.537851 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjr4p\"/\"openshift-service-ca.crt\"" Mar 18 17:46:10.538036 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.537861 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjr4p\"/\"kube-root-ca.crt\"" Mar 18 17:46:10.544403 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.544375 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjr4p/must-gather-9q8dv"] Mar 18 17:46:10.645315 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.645272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m6k9\" (UniqueName: \"kubernetes.io/projected/183c6576-bb2b-438d-9309-7f9d60174ae4-kube-api-access-7m6k9\") pod \"must-gather-9q8dv\" (UID: \"183c6576-bb2b-438d-9309-7f9d60174ae4\") " pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:10.645518 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.645441 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/183c6576-bb2b-438d-9309-7f9d60174ae4-must-gather-output\") pod \"must-gather-9q8dv\" (UID: \"183c6576-bb2b-438d-9309-7f9d60174ae4\") " pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:10.746177 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.746132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/183c6576-bb2b-438d-9309-7f9d60174ae4-must-gather-output\") pod \"must-gather-9q8dv\" (UID: \"183c6576-bb2b-438d-9309-7f9d60174ae4\") " pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:10.746358 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.746215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m6k9\" (UniqueName: \"kubernetes.io/projected/183c6576-bb2b-438d-9309-7f9d60174ae4-kube-api-access-7m6k9\") pod \"must-gather-9q8dv\" (UID: \"183c6576-bb2b-438d-9309-7f9d60174ae4\") " pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:10.746496 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.746475 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/183c6576-bb2b-438d-9309-7f9d60174ae4-must-gather-output\") pod \"must-gather-9q8dv\" (UID: \"183c6576-bb2b-438d-9309-7f9d60174ae4\") " pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:10.756186 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.756154 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m6k9\" (UniqueName: \"kubernetes.io/projected/183c6576-bb2b-438d-9309-7f9d60174ae4-kube-api-access-7m6k9\") pod \"must-gather-9q8dv\" (UID: \"183c6576-bb2b-438d-9309-7f9d60174ae4\") " pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:10.852358 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:10.852314 2570 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:11.002235 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:11.002190 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjr4p/must-gather-9q8dv"] Mar 18 17:46:11.005637 ip-10-0-129-201 kubenswrapper[2570]: W0318 17:46:11.005607 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod183c6576_bb2b_438d_9309_7f9d60174ae4.slice/crio-40bd947ea681b43756a966206e5838afe48f698813687779f21c361f5e91e3c2 WatchSource:0}: Error finding container 40bd947ea681b43756a966206e5838afe48f698813687779f21c361f5e91e3c2: Status 404 returned error can't find the container with id 40bd947ea681b43756a966206e5838afe48f698813687779f21c361f5e91e3c2 Mar 18 17:46:11.007803 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:11.007779 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:46:11.822163 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:11.822096 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" event={"ID":"183c6576-bb2b-438d-9309-7f9d60174ae4","Type":"ContainerStarted","Data":"40bd947ea681b43756a966206e5838afe48f698813687779f21c361f5e91e3c2"} Mar 18 17:46:16.850644 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:16.849977 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" event={"ID":"183c6576-bb2b-438d-9309-7f9d60174ae4","Type":"ContainerStarted","Data":"00c4ef7be4a2f06f68ed6cebcc7d8c814be76ae27418fb6f3a3065b075c32224"} Mar 18 17:46:16.850644 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:16.850030 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" 
event={"ID":"183c6576-bb2b-438d-9309-7f9d60174ae4","Type":"ContainerStarted","Data":"c325328fb0c40573534f5a8b2b16a8d9a216087a7d8ccf11abb2a0a281630546"} Mar 18 17:46:16.870427 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:16.870362 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" podStartSLOduration=2.0992948240000002 podStartE2EDuration="6.870343053s" podCreationTimestamp="2026-03-18 17:46:10 +0000 UTC" firstStartedPulling="2026-03-18 17:46:11.007941727 +0000 UTC m=+3695.113593068" lastFinishedPulling="2026-03-18 17:46:15.778989953 +0000 UTC m=+3699.884641297" observedRunningTime="2026-03-18 17:46:16.868921628 +0000 UTC m=+3700.974573010" watchObservedRunningTime="2026-03-18 17:46:16.870343053 +0000 UTC m=+3700.975994414" Mar 18 17:46:37.935612 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:37.935576 2570 generic.go:358] "Generic (PLEG): container finished" podID="183c6576-bb2b-438d-9309-7f9d60174ae4" containerID="c325328fb0c40573534f5a8b2b16a8d9a216087a7d8ccf11abb2a0a281630546" exitCode=0 Mar 18 17:46:37.936086 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:37.935629 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" event={"ID":"183c6576-bb2b-438d-9309-7f9d60174ae4","Type":"ContainerDied","Data":"c325328fb0c40573534f5a8b2b16a8d9a216087a7d8ccf11abb2a0a281630546"} Mar 18 17:46:37.936086 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:37.935938 2570 scope.go:117] "RemoveContainer" containerID="c325328fb0c40573534f5a8b2b16a8d9a216087a7d8ccf11abb2a0a281630546" Mar 18 17:46:38.320166 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:38.320134 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjr4p_must-gather-9q8dv_183c6576-bb2b-438d-9309-7f9d60174ae4/gather/0.log" Mar 18 17:46:42.258486 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:42.258447 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-hdpjw_f033a5d8-a3ec-47df-9593-01096396aeb5/global-pull-secret-syncer/0.log" Mar 18 17:46:42.461490 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:42.461459 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-f4pdp_4e1a7f7c-08f4-4781-a4aa-34e19ba7b69a/konnectivity-agent/0.log" Mar 18 17:46:42.486453 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:42.486421 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-201.ec2.internal_7a62256ddb99a5d51981b06e48f8ed26/haproxy/0.log" Mar 18 17:46:43.817900 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:43.817863 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hjr4p/must-gather-9q8dv"] Mar 18 17:46:43.818390 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:43.818086 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" containerName="copy" containerID="cri-o://00c4ef7be4a2f06f68ed6cebcc7d8c814be76ae27418fb6f3a3065b075c32224" gracePeriod=2 Mar 18 17:46:43.820379 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:43.820336 2570 status_manager.go:895] "Failed to get status for pod" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" err="pods \"must-gather-9q8dv\" is forbidden: User \"system:node:ip-10-0-129-201.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hjr4p\": no relationship found between node 'ip-10-0-129-201.ec2.internal' and this object" Mar 18 17:46:43.821110 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:43.821081 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hjr4p/must-gather-9q8dv"] Mar 18 17:46:43.960030 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:43.960006 2570 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjr4p_must-gather-9q8dv_183c6576-bb2b-438d-9309-7f9d60174ae4/copy/0.log" Mar 18 17:46:43.960392 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:43.960360 2570 generic.go:358] "Generic (PLEG): container finished" podID="183c6576-bb2b-438d-9309-7f9d60174ae4" containerID="00c4ef7be4a2f06f68ed6cebcc7d8c814be76ae27418fb6f3a3065b075c32224" exitCode=143 Mar 18 17:46:44.065008 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.064983 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjr4p_must-gather-9q8dv_183c6576-bb2b-438d-9309-7f9d60174ae4/copy/0.log" Mar 18 17:46:44.065373 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.065357 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:44.067459 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.067433 2570 status_manager.go:895] "Failed to get status for pod" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" err="pods \"must-gather-9q8dv\" is forbidden: User \"system:node:ip-10-0-129-201.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hjr4p\": no relationship found between node 'ip-10-0-129-201.ec2.internal' and this object" Mar 18 17:46:44.169502 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.169406 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/183c6576-bb2b-438d-9309-7f9d60174ae4-must-gather-output\") pod \"183c6576-bb2b-438d-9309-7f9d60174ae4\" (UID: \"183c6576-bb2b-438d-9309-7f9d60174ae4\") " Mar 18 17:46:44.169502 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.169467 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m6k9\" (UniqueName: 
\"kubernetes.io/projected/183c6576-bb2b-438d-9309-7f9d60174ae4-kube-api-access-7m6k9\") pod \"183c6576-bb2b-438d-9309-7f9d60174ae4\" (UID: \"183c6576-bb2b-438d-9309-7f9d60174ae4\") " Mar 18 17:46:44.171043 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.171008 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183c6576-bb2b-438d-9309-7f9d60174ae4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "183c6576-bb2b-438d-9309-7f9d60174ae4" (UID: "183c6576-bb2b-438d-9309-7f9d60174ae4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:46:44.172092 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.172065 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183c6576-bb2b-438d-9309-7f9d60174ae4-kube-api-access-7m6k9" (OuterVolumeSpecName: "kube-api-access-7m6k9") pod "183c6576-bb2b-438d-9309-7f9d60174ae4" (UID: "183c6576-bb2b-438d-9309-7f9d60174ae4"). InnerVolumeSpecName "kube-api-access-7m6k9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:46:44.270941 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.270907 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/183c6576-bb2b-438d-9309-7f9d60174ae4-must-gather-output\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 17:46:44.270941 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.270944 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7m6k9\" (UniqueName: \"kubernetes.io/projected/183c6576-bb2b-438d-9309-7f9d60174ae4-kube-api-access-7m6k9\") on node \"ip-10-0-129-201.ec2.internal\" DevicePath \"\"" Mar 18 17:46:44.519570 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.519486 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" path="/var/lib/kubelet/pods/183c6576-bb2b-438d-9309-7f9d60174ae4/volumes" Mar 18 17:46:44.964941 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.964912 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjr4p_must-gather-9q8dv_183c6576-bb2b-438d-9309-7f9d60174ae4/copy/0.log" Mar 18 17:46:44.965359 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.965288 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hjr4p/must-gather-9q8dv" Mar 18 17:46:44.965402 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.965289 2570 scope.go:117] "RemoveContainer" containerID="00c4ef7be4a2f06f68ed6cebcc7d8c814be76ae27418fb6f3a3065b075c32224" Mar 18 17:46:44.973861 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:44.973836 2570 scope.go:117] "RemoveContainer" containerID="c325328fb0c40573534f5a8b2b16a8d9a216087a7d8ccf11abb2a0a281630546" Mar 18 17:46:45.764494 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:45.764455 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4f3b35b3-c3ae-478c-92b4-16e87ffd743e/alertmanager/0.log" Mar 18 17:46:45.790387 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:45.790356 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4f3b35b3-c3ae-478c-92b4-16e87ffd743e/config-reloader/0.log" Mar 18 17:46:45.812219 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:45.812181 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4f3b35b3-c3ae-478c-92b4-16e87ffd743e/kube-rbac-proxy-web/0.log" Mar 18 17:46:45.836355 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:45.836330 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4f3b35b3-c3ae-478c-92b4-16e87ffd743e/kube-rbac-proxy/0.log" Mar 18 17:46:45.861443 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:45.861416 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4f3b35b3-c3ae-478c-92b4-16e87ffd743e/kube-rbac-proxy-metric/0.log" Mar 18 17:46:45.888154 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:45.888119 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4f3b35b3-c3ae-478c-92b4-16e87ffd743e/prom-label-proxy/0.log" Mar 18 17:46:45.911106 
ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:45.911076 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4f3b35b3-c3ae-478c-92b4-16e87ffd743e/init-config-reloader/0.log" Mar 18 17:46:45.988294 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:45.988182 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-fpf7x_74b771c1-e823-4290-b421-6cb942a7ae44/kube-state-metrics/0.log" Mar 18 17:46:46.012727 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.012672 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-fpf7x_74b771c1-e823-4290-b421-6cb942a7ae44/kube-rbac-proxy-main/0.log" Mar 18 17:46:46.038024 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.037991 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-fpf7x_74b771c1-e823-4290-b421-6cb942a7ae44/kube-rbac-proxy-self/0.log" Mar 18 17:46:46.068212 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.068181 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-b75c4fc57-7wkh4_211a4dff-a127-4785-a371-3aec44fa3a84/metrics-server/0.log" Mar 18 17:46:46.093252 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.093220 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-6d47bdb78d-ghr2r_f2e621ea-1527-4a57-af6a-4a71da84cf37/monitoring-plugin/0.log" Mar 18 17:46:46.208414 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.208384 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r5scn_89e76f6d-b92d-47b3-b48e-389e2a9574b6/node-exporter/0.log" Mar 18 17:46:46.245793 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.245689 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-r5scn_89e76f6d-b92d-47b3-b48e-389e2a9574b6/kube-rbac-proxy/0.log" Mar 18 17:46:46.296043 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.296017 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r5scn_89e76f6d-b92d-47b3-b48e-389e2a9574b6/init-textfile/0.log" Mar 18 17:46:46.407993 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.407956 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-jmc2v_1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8/kube-rbac-proxy-main/0.log" Mar 18 17:46:46.433340 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.433315 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-jmc2v_1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8/kube-rbac-proxy-self/0.log" Mar 18 17:46:46.458339 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.458313 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-jmc2v_1be80e0a-d7cc-40fa-8ab9-5da6eceaddc8/openshift-state-metrics/0.log" Mar 18 17:46:46.508201 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.508125 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_207ae0e1-58da-43d5-a7e4-475bb668e042/prometheus/0.log" Mar 18 17:46:46.529025 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.529001 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_207ae0e1-58da-43d5-a7e4-475bb668e042/config-reloader/0.log" Mar 18 17:46:46.558188 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.558164 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_207ae0e1-58da-43d5-a7e4-475bb668e042/thanos-sidecar/0.log" Mar 18 17:46:46.584793 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.584762 
2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_207ae0e1-58da-43d5-a7e4-475bb668e042/kube-rbac-proxy-web/0.log" Mar 18 17:46:46.610566 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.610537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_207ae0e1-58da-43d5-a7e4-475bb668e042/kube-rbac-proxy/0.log" Mar 18 17:46:46.642231 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.642202 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_207ae0e1-58da-43d5-a7e4-475bb668e042/kube-rbac-proxy-thanos/0.log" Mar 18 17:46:46.667255 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.667231 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_207ae0e1-58da-43d5-a7e4-475bb668e042/init-config-reloader/0.log" Mar 18 17:46:46.761377 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.761298 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-8444df798b-pksvx_7d73f442-384c-4654-89da-c1341a2fac11/prometheus-operator-admission-webhook/0.log" Mar 18 17:46:46.794340 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.794312 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-df7957d7c-svr72_4a426097-9a72-428e-944c-7f658f3a6f6f/telemeter-client/0.log" Mar 18 17:46:46.820634 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.820609 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-df7957d7c-svr72_4a426097-9a72-428e-944c-7f658f3a6f6f/reload/0.log" Mar 18 17:46:46.847847 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.847818 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-df7957d7c-svr72_4a426097-9a72-428e-944c-7f658f3a6f6f/kube-rbac-proxy/0.log" Mar 18 
17:46:46.891548 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.891520 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/thanos-query/0.log" Mar 18 17:46:46.916637 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.916608 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/kube-rbac-proxy-web/0.log" Mar 18 17:46:46.941245 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.941218 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/kube-rbac-proxy/0.log" Mar 18 17:46:46.967086 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.967062 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/prom-label-proxy/0.log" Mar 18 17:46:46.991460 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:46.991432 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/kube-rbac-proxy-rules/0.log" Mar 18 17:46:47.024762 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:47.024674 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c4d988b9b-7xpv2_0ccf7ab2-3fd9-47f2-94af-3c2b3e77d0c9/kube-rbac-proxy-metrics/0.log" Mar 18 17:46:48.285058 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:48.285027 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-55b77584bb-tb98w_f9f34891-1c31-4c9e-9365-63ede3d6127d/networking-console-plugin/0.log" Mar 18 17:46:48.719506 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:48.719422 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/2.log" Mar 18 17:46:48.723957 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:48.723934 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-gjg6w_0c78a76c-298c-468a-a6bd-98bc2950f67a/console-operator/3.log" Mar 18 17:46:49.186842 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.186807 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk"] Mar 18 17:46:49.187180 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.187166 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" containerName="copy" Mar 18 17:46:49.187232 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.187182 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" containerName="copy" Mar 18 17:46:49.187232 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.187205 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" containerName="gather" Mar 18 17:46:49.187232 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.187210 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" containerName="gather" Mar 18 17:46:49.187359 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.187278 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" containerName="gather" Mar 18 17:46:49.187359 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.187287 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="183c6576-bb2b-438d-9309-7f9d60174ae4" containerName="copy" Mar 18 17:46:49.192398 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.192370 2570 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.194896 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.194872 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vxxd9\"/\"openshift-service-ca.crt\"" Mar 18 17:46:49.195864 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.195840 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vxxd9\"/\"kube-root-ca.crt\"" Mar 18 17:46:49.195992 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.195840 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vxxd9\"/\"default-dockercfg-w25nj\"" Mar 18 17:46:49.202432 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.202405 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk"] Mar 18 17:46:49.317841 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.317799 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-sys\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.317841 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.317838 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-proc\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.318255 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.317856 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-podres\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.318255 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.317989 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-lib-modules\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.318255 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.318101 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68p27\" (UniqueName: \"kubernetes.io/projected/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-kube-api-access-68p27\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.419066 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.419023 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68p27\" (UniqueName: \"kubernetes.io/projected/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-kube-api-access-68p27\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.419066 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.419072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-sys\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: 
\"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.419342 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.419087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-proc\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.419342 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.419106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-podres\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.419342 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.419144 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-lib-modules\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.419342 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.419173 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-proc\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.419342 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.419175 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-sys\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.419342 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.419284 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-podres\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.419342 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.419287 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-lib-modules\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.430479 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.430442 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68p27\" (UniqueName: \"kubernetes.io/projected/1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80-kube-api-access-68p27\") pod \"perf-node-gather-daemonset-6b8nk\" (UID: \"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.503513 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.503416 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:49.638906 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.638877 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk"] Mar 18 17:46:49.642037 ip-10-0-129-201 kubenswrapper[2570]: W0318 17:46:49.642008 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1d1ee2bb_d449_465c_b3c6_62e5b7bfbd80.slice/crio-ef30f93c6f9efcf83a18b613d3ede2065dfcf22d44e6a429d001a2f03f3f479c WatchSource:0}: Error finding container ef30f93c6f9efcf83a18b613d3ede2065dfcf22d44e6a429d001a2f03f3f479c: Status 404 returned error can't find the container with id ef30f93c6f9efcf83a18b613d3ede2065dfcf22d44e6a429d001a2f03f3f479c Mar 18 17:46:49.986250 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.986213 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" event={"ID":"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80","Type":"ContainerStarted","Data":"a0bb9d9b3e83d4210d94642ff740164f731f886b3fdffb1f55cec1c9232be0bf"} Mar 18 17:46:49.986250 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.986258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" event={"ID":"1d1ee2bb-d449-465c-b3c6-62e5b7bfbd80","Type":"ContainerStarted","Data":"ef30f93c6f9efcf83a18b613d3ede2065dfcf22d44e6a429d001a2f03f3f479c"} Mar 18 17:46:49.986456 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:49.986405 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" Mar 18 17:46:50.006067 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:50.006013 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk" 
podStartSLOduration=1.005998917 podStartE2EDuration="1.005998917s" podCreationTimestamp="2026-03-18 17:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:46:50.003923462 +0000 UTC m=+3734.109574825" watchObservedRunningTime="2026-03-18 17:46:50.005998917 +0000 UTC m=+3734.111650279"
Mar 18 17:46:50.395019 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:50.394982 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t4psl_d386ca96-8632-46cd-b756-90a53fad9ef1/dns/0.log"
Mar 18 17:46:50.421521 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:50.421492 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t4psl_d386ca96-8632-46cd-b756-90a53fad9ef1/kube-rbac-proxy/0.log"
Mar 18 17:46:50.447166 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:50.447136 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-blj9x_896404ec-1cea-4a36-9c02-cb5316bac310/dns-node-resolver/0.log"
Mar 18 17:46:50.957208 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:50.957182 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8mt8g_aba28703-3193-48ac-bad1-170ac214d793/node-ca/0.log"
Mar 18 17:46:51.877088 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:51.877055 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7bd4c46cc4-vslwk_7b285194-6029-4441-b4e2-56fdcc973573/router/0.log"
Mar 18 17:46:52.210329 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:52.210252 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4vc72_529555a4-f4da-4842-814f-1acffad52caf/serve-healthcheck-canary/0.log"
Mar 18 17:46:52.620718 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:52.620674 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-p44bj_b81d0e21-b085-449c-848a-8150e032f670/insights-operator/0.log"
Mar 18 17:46:52.622608 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:52.622587 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-p44bj_b81d0e21-b085-449c-848a-8150e032f670/insights-operator/1.log"
Mar 18 17:46:52.647255 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:52.647227 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-55qz2_477b54c8-2376-4b83-b755-27327a399096/kube-rbac-proxy/0.log"
Mar 18 17:46:52.670928 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:52.670901 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-55qz2_477b54c8-2376-4b83-b755-27327a399096/exporter/0.log"
Mar 18 17:46:52.695688 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:52.695660 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-55qz2_477b54c8-2376-4b83-b755-27327a399096/extractor/0.log"
Mar 18 17:46:55.176798 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:55.176765 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-kjrd6_cafebdbc-01e1-44c1-b600-699b4f85fd05/s3-init/0.log"
Mar 18 17:46:55.213930 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:55.213900 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-64jx8_494f89b2-5b12-473f-929d-4fb2fd6b6233/s3-tls-init-custom/0.log"
Mar 18 17:46:55.247597 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:55.247568 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-8m6mn_82a7c18b-a3f6-44bc-8673-44ce516c61b2/s3-tls-init-serving/0.log"
Mar 18 17:46:56.001218 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:46:56.001187 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6b8nk"
Mar 18 17:47:01.143019 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.142982 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ptdgh_823ea246-d154-4a18-b04f-221eec27416d/kube-multus-additional-cni-plugins/0.log"
Mar 18 17:47:01.170343 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.170317 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ptdgh_823ea246-d154-4a18-b04f-221eec27416d/egress-router-binary-copy/0.log"
Mar 18 17:47:01.197757 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.197730 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ptdgh_823ea246-d154-4a18-b04f-221eec27416d/cni-plugins/0.log"
Mar 18 17:47:01.222476 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.222446 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ptdgh_823ea246-d154-4a18-b04f-221eec27416d/bond-cni-plugin/0.log"
Mar 18 17:47:01.260915 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.260884 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ptdgh_823ea246-d154-4a18-b04f-221eec27416d/routeoverride-cni/0.log"
Mar 18 17:47:01.284866 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.284839 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ptdgh_823ea246-d154-4a18-b04f-221eec27416d/whereabouts-cni-bincopy/0.log"
Mar 18 17:47:01.313016 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.312990 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ptdgh_823ea246-d154-4a18-b04f-221eec27416d/whereabouts-cni/0.log"
Mar 18 17:47:01.596311 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.596275 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cnpd6_f7a45c76-25de-47f6-8a92-fe9ea77a8a9c/kube-multus/0.log"
Mar 18 17:47:01.681620 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.681582 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ktpw5_38dc6da4-4394-4935-80a5-6a872bf72125/network-metrics-daemon/0.log"
Mar 18 17:47:01.710066 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:01.710032 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ktpw5_38dc6da4-4394-4935-80a5-6a872bf72125/kube-rbac-proxy/0.log"
Mar 18 17:47:02.829156 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:02.829114 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-controller/0.log"
Mar 18 17:47:02.849283 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:02.849256 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/0.log"
Mar 18 17:47:02.867022 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:02.866984 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovn-acl-logging/1.log"
Mar 18 17:47:02.890033 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:02.890006 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/kube-rbac-proxy-node/0.log"
Mar 18 17:47:02.914430 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:02.914396 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/kube-rbac-proxy-ovn-metrics/0.log"
Mar 18 17:47:02.942907 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:02.942871 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/northd/0.log"
Mar 18 17:47:02.971964 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:02.971939 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/nbdb/0.log"
Mar 18 17:47:03.000752 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:03.000713 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/sbdb/0.log"
Mar 18 17:47:03.105843 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:03.105752 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gk7ln_935b66df-6c0c-487a-a4ff-9539cb02c34d/ovnkube-controller/0.log"
Mar 18 17:47:04.791340 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:04.791306 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-cc88fdd44-km9x2_548137bf-b85f-4d92-9e10-f9cc858486fb/check-endpoints/0.log"
Mar 18 17:47:04.873712 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:04.873675 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-x724s_b64a1006-8f55-41c1-9d77-457180e9a557/network-check-target-container/0.log"
Mar 18 17:47:05.968377 ip-10-0-129-201 kubenswrapper[2570]: I0318 17:47:05.968343 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-9bgx9_aa591bdc-c7e1-4131-b41c-dd5043afacbf/iptables-alerter/0.log"