Apr 17 11:16:32.831395 ip-10-0-142-247 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:33.284986 ip-10-0-142-247 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:33.284986 ip-10-0-142-247 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:33.284986 ip-10-0-142-247 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:33.284986 ip-10-0-142-247 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:33.284986 ip-10-0-142-247 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:33.287785 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.287700 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:33.292381 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292366 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:33.292381 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292381 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292385 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292388 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292391 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292394 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292397 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292400 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292403 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292406 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292408 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292411 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292414 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292417 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292419 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292422 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292424 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292427 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292430 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292432 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292435 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:33.292445 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292438 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292440 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292443 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292445 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292448 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292451 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292453 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292457 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292459 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292462 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292464 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292467 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292470 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292472 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292475 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292478 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292480 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292482 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292485 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292488 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:33.293002 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292490 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292493 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292495 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292498 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292501 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292503 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292506 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292508 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292510 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292515 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292519 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292522 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292525 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292527 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292531 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292533 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292536 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292539 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292542 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:33.293538 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292545 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292548 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292551 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292554 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292557 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292559 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292562 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292566 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292569 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292572 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292574 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292577 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292580 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292583 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292585 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292588 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292591 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292594 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292596 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292599 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:33.294011 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292602 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292604 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292607 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292611 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292614 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292618 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292988 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292993 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292996 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.292999 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293002 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293005 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293008 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293012 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293015 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293018 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293020 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293023 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293025 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293028 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:33.294518 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293031 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293034 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293036 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293039 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293042 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293044 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293047 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293050 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293053 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293055 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293058 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293060 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293063 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293065 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293068 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293070 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293073 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293075 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293078 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293081 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:33.294993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293084 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293086 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293089 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293091 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293094 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293097 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293100 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293103 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293105 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293122 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293126 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293129 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293132 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293135 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293138 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293141 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293144 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293146 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293149 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:33.295500 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293152 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293154 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293157 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293159 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293162 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293164 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293167 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293170 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293173 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293176 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293178 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293181 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293184 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293186 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293189 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293191 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293194 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293196 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293199 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293201 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:33.296014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293205 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293208 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293211 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293213 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293215 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293218 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293220 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293223 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293225 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293228 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293232 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293237 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293240 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293318 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293325 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293330 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293335 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293339 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293342 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293347 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:16:33.296513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293352 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293355 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293358 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293361 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293364 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293367 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293370 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293373 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293376 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293379 2577 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293382 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293385 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293391 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293395 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293398 2577 flags.go:64] FLAG: --config-dir=""
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293401 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293404 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293408 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293411 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293414 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293417 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293420 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293423 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293426 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293429 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:16:33.297003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293432 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293436 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293440 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293443 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293446 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293448 2577 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293451 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293455 2577 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293458 2577 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293461 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293464 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293467 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293471 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293474 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293477 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293480 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293482 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293485 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293488 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293492 2577 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293495 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293499 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293502 2577 flags.go:64] FLAG: --feature-gates="" Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293506 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293509 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 11:16:33.297617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293513 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293516 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293519 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293522 2577 flags.go:64] FLAG: --help="false" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293525 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293529 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293532 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293535 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293538 2577 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293542 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293546 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293549 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293552 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293555 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293558 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293561 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293564 2577 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293567 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293570 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293573 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293576 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293578 2577 flags.go:64] FLAG: --lock-file="" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293581 2577 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293584 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:16:33.298234 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293587 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293592 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293595 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293598 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293601 2577 flags.go:64] FLAG: --logging-format="text" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293604 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293608 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293611 2577 flags.go:64] FLAG: --manifest-url="" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293613 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293618 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293621 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293625 2577 flags.go:64] FLAG: --max-pods="110" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293628 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: 
I0417 11:16:33.293631 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293634 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293637 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293640 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293643 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293646 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293653 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293656 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293660 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293663 2577 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:16:33.298828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293666 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293671 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293674 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293678 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 17 
11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293681 2577 flags.go:64] FLAG: --port="10250" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293684 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293687 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0386f21d5a4e138a2" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293690 2577 flags.go:64] FLAG: --qos-reserved="" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293693 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293696 2577 flags.go:64] FLAG: --register-node="true" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293698 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293701 2577 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293705 2577 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293708 2577 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293711 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293714 2577 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293718 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293721 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293724 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 
11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293727 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293730 2577 flags.go:64] FLAG: --runonce="false" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293733 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293737 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293739 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293742 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293745 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 11:16:33.299400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293748 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293751 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293754 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293757 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293760 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293763 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293766 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:16:33.300032 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:16:33.293769 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293772 2577 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293774 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293780 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293784 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293786 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293797 2577 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293800 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293802 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293805 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293808 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293812 2577 flags.go:64] FLAG: --v="2" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293816 2577 flags.go:64] FLAG: --version="false" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293820 2577 flags.go:64] FLAG: --vmodule="" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293827 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 
11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.293831 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293916 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293919 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:33.300032 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293923 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293926 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293929 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293931 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293934 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293937 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293940 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293942 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293945 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293948 2577 feature_gate.go:328] unrecognized feature gate: 
AzureMultiDisk Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293950 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293953 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293955 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293958 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293960 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293964 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293968 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293971 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293973 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:33.300653 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293976 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293978 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293981 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:33.301167 ip-10-0-142-247 
kubenswrapper[2577]: W0417 11:16:33.293983 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293986 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293989 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293991 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293994 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.293996 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294000 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294003 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294006 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294008 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294011 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294015 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294018 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294021 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294024 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294026 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:33.301167 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294028 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294031 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294034 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294037 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294039 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294042 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294044 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294047 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 
11:16:33.294049 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294052 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294056 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294058 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294061 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294063 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294066 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294068 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294071 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294073 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294076 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294078 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:33.301633 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294080 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:33.302262 
ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294083 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294087 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294089 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294092 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294095 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294097 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294100 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294102 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294105 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294120 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294123 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294126 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294128 2577 feature_gate.go:328] unrecognized feature gate: 
ClusterMonitoringConfig
Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294131 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294134 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294137 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294139 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294141 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294157 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:33.302262 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294160 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:33.303033 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294162 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:33.303033 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294166 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:33.303033 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294169 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:33.303033 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294172 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:33.303033 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.294174 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:33.303033 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.295133 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:33.304284 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.304265 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:16:33.304284 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.304284 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304333 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304339 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304342 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304345 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304348 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304351 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304354 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304356 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304359 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304362 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304364 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:33.304362 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304367 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304370 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304373 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304376 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304378 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304381 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304384 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304386 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304389 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304391 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304394 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304396 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304399 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304401 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304404 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304406 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304409 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304412 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304415 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304418 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:33.304667 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304420 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304423 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304425 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304428 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304430 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304433 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304437 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304441 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304444 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304447 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304450 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304453 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304455 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304458 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304462 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304465 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304468 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304471 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304474 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304477 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:33.305179 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304479 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304482 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304485 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304488 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304490 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304493 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304495 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304498 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304500 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304504 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304506 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304509 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304512 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304514 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304517 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304519 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304522 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304524 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304527 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:33.305678 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304529 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304532 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304534 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304537 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304540 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304543 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304546 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304549 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304552 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304556 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304559 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304562 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304565 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304568 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304570 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:33.306164 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304573 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.304578 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304680 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304685 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304688 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304691 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304694 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304697 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304700 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304703 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304706 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304709 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304712 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304716 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304720 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:33.306571 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304723 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304726 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304729 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304732 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304735 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304738 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304740 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304743 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304746 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304749 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304752 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304755 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304758 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304760 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304762 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304765 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304768 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304770 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304773 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304775 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:33.306956 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304778 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304780 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304783 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304785 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304787 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304790 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304792 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304795 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304797 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304800 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304803 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304806 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304809 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304811 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304813 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304816 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304818 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304821 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304823 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304826 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:33.307525 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304829 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304832 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304835 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304837 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304840 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304843 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304846 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304849 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304851 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304854 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304857 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304859 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304862 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304864 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304867 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304869 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304872 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304874 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304876 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:33.308014 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304879 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304881 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304884 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304886 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304889 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304891 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304893 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304896 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304898 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304901 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304904 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304906 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304909 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:33.304911 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.304916 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:33.308487 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.305763 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:16:33.309831 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.309817 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:16:33.310810 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.310798 2577 server.go:1019] "Starting client certificate rotation"
Apr 17 11:16:33.310905 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.310889 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:33.310940 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.310930 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:33.336597 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.336580 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:33.341285 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.341265 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:33.351059 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.351041 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:16:33.356924 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.356909 2577 log.go:25] "Validated CRI v1 image API"
Apr 17 11:16:33.361491 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.361475 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:16:33.365759 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.365739 2577 fs.go:135] Filesystem UUIDs: map[4931f1b6-634d-422d-985a-fe06f997a0b2:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 91eeedfe-2ee8-43b7-99f4-24ff21d87145:/dev/nvme0n1p4]
Apr 17 11:16:33.365829 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.365759 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:16:33.366183 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.366168 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:33.371497 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.371383 2577 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:33.370101695 +0000 UTC m=+0.418271573 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098713 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a5daabfc1666dde940b4a427427f3 SystemUUID:ec2a5daa-bfc1-666d-de94-0b4a427427f3 BootID:ed6b733b-af18-45f2-b9be-30e8ef097958 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:38:48:c9:20:71 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:38:48:c9:20:71 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:4d:64:3f:68:53 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:16:33.371497 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.371492 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:16:33.371612 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.371569 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:16:33.373559 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.373531 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:16:33.374028 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.373576 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-247.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 11:16:33.374134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.374037 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 11:16:33.374134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.374048 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 11:16:33.374134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.374062
2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:33.374904 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.374894 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:33.375711 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.375701 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:33.375815 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.375807 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 11:16:33.378803 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.378794 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 17 11:16:33.378841 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.378813 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 11:16:33.378841 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.378830 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 11:16:33.378841 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.378841 2577 kubelet.go:397] "Adding apiserver pod source" Apr 17 11:16:33.378929 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.378852 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 11:16:33.379964 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.379953 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:33.380006 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.379971 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:33.383098 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.383083 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 11:16:33.384890 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:16:33.384877 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 11:16:33.386236 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386223 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 11:16:33.386283 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386245 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 11:16:33.386283 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386254 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 11:16:33.386283 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386263 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 11:16:33.386283 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386272 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 11:16:33.386283 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386282 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 11:16:33.386439 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386291 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 11:16:33.386439 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386300 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 11:16:33.386439 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386310 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 11:16:33.386439 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386319 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 11:16:33.386439 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386330 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
11:16:33.386439 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.386343 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 11:16:33.388197 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.388183 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 11:16:33.388197 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.388199 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 11:16:33.389440 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.389421 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j9474" Apr 17 11:16:33.390539 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.390524 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-247.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 11:16:33.391130 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.391085 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 11:16:33.391166 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.391096 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-247.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 11:16:33.391658 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.391646 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 11:16:33.391702 
ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.391685 2577 server.go:1295] "Started kubelet" Apr 17 11:16:33.391810 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.391782 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 11:16:33.391866 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.391791 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 11:16:33.391912 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.391903 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 11:16:33.392604 ip-10-0-142-247 systemd[1]: Started Kubernetes Kubelet. Apr 17 11:16:33.393661 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.393646 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 11:16:33.394838 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.394822 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 17 11:16:33.395261 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.395234 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j9474" Apr 17 11:16:33.398377 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.397328 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-247.ec2.internal.18a720c17b60a517 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-247.ec2.internal,UID:ip-10-0-142-247.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-247.ec2.internal,},FirstTimestamp:2026-04-17 11:16:33.391658263 +0000 UTC 
m=+0.439828142,LastTimestamp:2026-04-17 11:16:33.391658263 +0000 UTC m=+0.439828142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-247.ec2.internal,}" Apr 17 11:16:33.399511 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.399483 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:33.399982 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.399964 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 11:16:33.400755 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.400739 2577 factory.go:55] Registering systemd factory Apr 17 11:16:33.400755 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.400758 2577 factory.go:223] Registration of the systemd container factory successfully Apr 17 11:16:33.400899 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.400852 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 11:16:33.400899 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.400871 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 11:16:33.400899 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.400882 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 11:16:33.401039 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.400854 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:33.401039 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.400997 2577 factory.go:153] Registering CRI-O factory Apr 17 11:16:33.401039 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.401014 2577 factory.go:223] Registration of the crio container factory successfully Apr 17 11:16:33.401039 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.401015 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 17 
11:16:33.401205 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.401049 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 17 11:16:33.401205 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.401064 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 11:16:33.401205 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.401086 2577 factory.go:103] Registering Raw factory Apr 17 11:16:33.401205 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.401101 2577 manager.go:1196] Started watching for new ooms in manager Apr 17 11:16:33.401722 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.401690 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 11:16:33.402395 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.402381 2577 manager.go:319] Starting recovery of all containers Apr 17 11:16:33.410791 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.410767 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:33.413494 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.413337 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-247.ec2.internal\" not found" node="ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.414426 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.414411 2577 manager.go:324] Recovery completed Apr 17 11:16:33.415655 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.415637 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no 
such file or directory Apr 17 11:16:33.418563 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.418551 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:33.420666 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.420649 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:33.420719 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.420678 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:33.420719 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.420689 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:33.421098 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.421083 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 11:16:33.421098 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.421095 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 11:16:33.421199 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.421132 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:33.423303 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.423293 2577 policy_none.go:49] "None policy: Start" Apr 17 11:16:33.423346 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.423308 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 11:16:33.423346 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.423318 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 17 11:16:33.463704 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.463688 2577 manager.go:341] "Starting Device Plugin manager" Apr 17 11:16:33.477045 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.463718 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint 
is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 11:16:33.477045 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.463728 2577 server.go:85] "Starting device plugin registration server" Apr 17 11:16:33.477045 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.463930 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 11:16:33.477045 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.463943 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 11:16:33.477045 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.464016 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 11:16:33.477045 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.464071 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 11:16:33.477045 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.464079 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 11:16:33.477045 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.464618 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 11:16:33.477045 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.464653 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:33.532941 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.532912 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 11:16:33.534136 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.534106 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 11:16:33.534226 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.534146 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 11:16:33.534226 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.534165 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 11:16:33.534226 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.534172 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 11:16:33.534226 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.534201 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 11:16:33.536844 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.536799 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:33.564436 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.564421 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:33.565238 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.565224 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:33.565323 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.565258 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:33.565323 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.565274 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:33.565323 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.565304 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.578705 ip-10-0-142-247 kubenswrapper[2577]: I0417 
11:16:33.578688 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.578760 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.578708 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-247.ec2.internal\": node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:33.609270 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.609247 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:33.634261 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.634231 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal"] Apr 17 11:16:33.634311 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.634304 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:33.636325 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.636304 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:33.636407 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.636333 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:33.636407 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.636372 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:33.637602 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.637590 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:33.638254 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.638240 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:33.638315 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.638272 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:33.638315 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.638282 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:33.638397 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.638382 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.638430 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.638411 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:33.639029 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.639017 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:33.639099 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.639038 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:33.639099 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.639047 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:33.639366 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.639354 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.639416 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.639376 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:33.639962 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.639943 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:33.640042 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.639974 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:33.640042 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.639988 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:33.658614 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.658599 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-247.ec2.internal\" not found" node="ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.662257 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.662235 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-247.ec2.internal\" not found" node="ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.702981 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.702952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/754c4e5142ebf952f02602a3888b764e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal\" (UID: \"754c4e5142ebf952f02602a3888b764e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.703079 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:16:33.702985 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/754c4e5142ebf952f02602a3888b764e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal\" (UID: \"754c4e5142ebf952f02602a3888b764e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.703079 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.703002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c775281676decdc6d9802c88c4684de2-config\") pod \"kube-apiserver-proxy-ip-10-0-142-247.ec2.internal\" (UID: \"c775281676decdc6d9802c88c4684de2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.710037 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.710020 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:33.803826 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.803753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/754c4e5142ebf952f02602a3888b764e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal\" (UID: \"754c4e5142ebf952f02602a3888b764e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.803826 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.803792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/754c4e5142ebf952f02602a3888b764e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal\" (UID: \"754c4e5142ebf952f02602a3888b764e\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.803826 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.803815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c775281676decdc6d9802c88c4684de2-config\") pod \"kube-apiserver-proxy-ip-10-0-142-247.ec2.internal\" (UID: \"c775281676decdc6d9802c88c4684de2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.803999 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.803842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/754c4e5142ebf952f02602a3888b764e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal\" (UID: \"754c4e5142ebf952f02602a3888b764e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.803999 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.803862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/754c4e5142ebf952f02602a3888b764e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal\" (UID: \"754c4e5142ebf952f02602a3888b764e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.803999 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.803846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c775281676decdc6d9802c88c4684de2-config\") pod \"kube-apiserver-proxy-ip-10-0-142-247.ec2.internal\" (UID: \"c775281676decdc6d9802c88c4684de2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.810840 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.810822 2577 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:33.911620 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:33.911569 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:33.960755 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.960734 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" Apr 17 11:16:33.964172 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:33.964158 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal" Apr 17 11:16:34.012178 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:34.012147 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:34.112752 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:34.112675 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:34.213259 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:34.213228 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:34.310897 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.310866 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 11:16:34.311477 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.311007 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:34.311477 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:16:34.311054 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:34.314008 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:34.313993 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:34.398176 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.398090 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:33 +0000 UTC" deadline="2027-10-20 04:59:44.641774459 +0000 UTC" Apr 17 11:16:34.398176 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.398162 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13217h43m10.243616659s" Apr 17 11:16:34.400354 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.400333 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:34.410784 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.410757 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:16:34.414476 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:34.414456 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:34.447384 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.447360 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h55m8" Apr 17 11:16:34.457646 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:16:34.457626 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h55m8" Apr 17 11:16:34.505846 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:34.505802 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod754c4e5142ebf952f02602a3888b764e.slice/crio-b47ac34058917603267d7fa4de1fec72edc9beeea69c3578fcce6452c49d338c WatchSource:0}: Error finding container b47ac34058917603267d7fa4de1fec72edc9beeea69c3578fcce6452c49d338c: Status 404 returned error can't find the container with id b47ac34058917603267d7fa4de1fec72edc9beeea69c3578fcce6452c49d338c Apr 17 11:16:34.506377 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:34.506352 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc775281676decdc6d9802c88c4684de2.slice/crio-bdeca560ed669707ef1aeed47207873933d3e7dc47f595aafa76902d1837b7df WatchSource:0}: Error finding container bdeca560ed669707ef1aeed47207873933d3e7dc47f595aafa76902d1837b7df: Status 404 returned error can't find the container with id bdeca560ed669707ef1aeed47207873933d3e7dc47f595aafa76902d1837b7df Apr 17 11:16:34.511343 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.511328 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:16:34.514968 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:34.514949 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:34.537414 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.537369 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal" 
event={"ID":"c775281676decdc6d9802c88c4684de2","Type":"ContainerStarted","Data":"bdeca560ed669707ef1aeed47207873933d3e7dc47f595aafa76902d1837b7df"} Apr 17 11:16:34.538303 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.538276 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" event={"ID":"754c4e5142ebf952f02602a3888b764e","Type":"ContainerStarted","Data":"b47ac34058917603267d7fa4de1fec72edc9beeea69c3578fcce6452c49d338c"} Apr 17 11:16:34.615876 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:34.615844 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:34.628715 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.628693 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:34.716938 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:34.716875 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:34.817378 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:34.817350 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-247.ec2.internal\" not found" Apr 17 11:16:34.879866 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.879839 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:34.901179 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.901149 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" Apr 17 11:16:34.917532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.917494 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a 
DNS label is recommended: [must not contain dots]" Apr 17 11:16:34.918640 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.918606 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal" Apr 17 11:16:34.927461 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:34.927444 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:35.379660 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.379591 2577 apiserver.go:52] "Watching apiserver" Apr 17 11:16:35.385840 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.385816 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 11:16:35.386206 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.386179 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-46xz6","kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4","openshift-dns/node-resolver-hq6cq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal","openshift-multus/multus-additional-cni-plugins-w2d4x","openshift-multus/multus-k2xxs","openshift-network-diagnostics/network-check-target-9975j","openshift-cluster-node-tuning-operator/tuned-w6wn4","openshift-image-registry/node-ca-mbc4v","openshift-multus/network-metrics-daemon-tl874","openshift-network-operator/iptables-alerter-qdhpl","openshift-ovn-kubernetes/ovnkube-node-65xlv"] Apr 17 11:16:35.388879 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.388856 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.390765 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.390743 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 11:16:35.390865 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.390839 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 11:16:35.390920 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.390891 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 11:16:35.391117 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.391093 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 11:16:35.391282 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.391157 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jx9lb\"" Apr 17 11:16:35.391282 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.391212 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.393064 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.392997 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h22lq\"" Apr 17 11:16:35.393064 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.393008 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:35.393064 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.393034 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 11:16:35.393259 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.393141 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 11:16:35.393259 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.393034 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 11:16:35.393470 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.393451 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hq6cq" Apr 17 11:16:35.395174 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.395131 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 11:16:35.395174 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.395167 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2sgfg\"" Apr 17 11:16:35.395306 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.395230 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 11:16:35.396077 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.396058 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.397592 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.397573 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 11:16:35.397860 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.397841 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-x5tfn\"" Apr 17 11:16:35.397936 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.397849 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 11:16:35.398307 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.398289 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:16:35.399902 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.399883 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 11:16:35.399991 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.399884 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gxl5d\"" Apr 17 11:16:35.400061 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.400006 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 11:16:35.400804 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.400787 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:16:35.400898 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:35.400863 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e" Apr 17 11:16:35.403449 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.403354 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.405430 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.405411 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:35.405543 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.405453 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:35.405616 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.405597 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pb47b\"" Apr 17 11:16:35.408012 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.407994 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.408103 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.408078 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:16:35.408180 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:35.408165 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e" Apr 17 11:16:35.409971 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.409950 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8pfgg\"" Apr 17 11:16:35.410284 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.410266 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 11:16:35.410388 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.410364 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 11:16:35.410449 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.410400 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 11:16:35.411177 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411157 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 11:16:35.411656 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-os-release\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.411745 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411661 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6311d15-5f59-4e6c-8732-269f06b40c16-cni-binary-copy\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.411745 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411690 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-modprobe-d\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.411745 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411716 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-sysctl-d\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.411866 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zplcx\" (UniqueName: \"kubernetes.io/projected/cd45e2c2-be74-4898-855b-e51a00ea7a92-kube-api-access-zplcx\") pod \"node-resolver-hq6cq\" (UID: \"cd45e2c2-be74-4898-855b-e51a00ea7a92\") " pod="openshift-dns/node-resolver-hq6cq" Apr 17 11:16:35.411866 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-system-cni-dir\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.411866 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/002766c9-b94d-4afa-a980-2f7abc5b32d2-cni-binary-copy\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.411986 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-sysconfig\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.411986 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411944 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-kubernetes\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.411986 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.411968 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-sys\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.412095 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412008 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-lib-modules\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.412095 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412044 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-var-lib-kubelet\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.412095 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-cnibin\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412264 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-var-lib-cni-bin\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412264 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:16:35.412180 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-etc-selinux\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.412264 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.412264 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412230 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v6d6\" (UniqueName: \"kubernetes.io/projected/2c85040c-9a42-47fe-bdd4-7a0d5418502a-kube-api-access-2v6d6\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.412264 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-os-release\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412471 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-var-lib-kubelet\") pod 
\"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412471 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412298 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2mn2\" (UniqueName: \"kubernetes.io/projected/3576c24b-1c4c-4f39-b921-a4dd5a21236e-kube-api-access-w2mn2\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.412471 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-systemd\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.412471 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cd45e2c2-be74-4898-855b-e51a00ea7a92-tmp-dir\") pod \"node-resolver-hq6cq\" (UID: \"cd45e2c2-be74-4898-855b-e51a00ea7a92\") " pod="openshift-dns/node-resolver-hq6cq" Apr 17 11:16:35.412471 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-cni-dir\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412471 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412428 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-run-k8s-cni-cncf-io\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412471 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-run-multus-certs\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412775 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6311d15-5f59-4e6c-8732-269f06b40c16-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.412775 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c85040c-9a42-47fe-bdd4-7a0d5418502a-tmp\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.412775 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-socket-dir-parent\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412775 ip-10-0-142-247 kubenswrapper[2577]: I0417 
11:16:35.412609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-run-netns\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412775 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-conf-dir\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412775 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghfp\" (UniqueName: \"kubernetes.io/projected/002766c9-b94d-4afa-a980-2f7abc5b32d2-kube-api-access-9ghfp\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.412775 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-socket-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.412775 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412751 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:35.412775 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6311d15-5f59-4e6c-8732-269f06b40c16-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1b74d479-a57e-4b33-8dc6-cd4321d01595-konnectivity-ca\") pod \"konnectivity-agent-46xz6\" (UID: \"1b74d479-a57e-4b33-8dc6-cd4321d01595\") " pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-sysctl-conf\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-hostroot\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.413250 
ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-device-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412902 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ct2lr\"" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412934 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvsfw\" (UniqueName: \"kubernetes.io/projected/e6311d15-5f59-4e6c-8732-269f06b40c16-kube-api-access-nvsfw\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412975 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-run\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.412984 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 11:16:35.413250 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:16:35.412998 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-host\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-tuned\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cd45e2c2-be74-4898-855b-e51a00ea7a92-hosts-file\") pod \"node-resolver-hq6cq\" (UID: \"cd45e2c2-be74-4898-855b-e51a00ea7a92\") " pod="openshift-dns/node-resolver-hq6cq" Apr 17 11:16:35.413250 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:16:35.413770 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-daemon-config\") pod \"multus-k2xxs\" (UID: 
\"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.413770 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-system-cni-dir\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.413770 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-cnibin\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.413770 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1b74d479-a57e-4b33-8dc6-cd4321d01595-agent-certs\") pod \"konnectivity-agent-46xz6\" (UID: \"1b74d479-a57e-4b33-8dc6-cd4321d01595\") " pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:16:35.413770 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-var-lib-cni-multus\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.413770 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-etc-kubernetes\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.413770 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-registration-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.413770 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-sys-fs\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.413770 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.413705 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.415450 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.415408 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 11:16:35.415927 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.415795 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 11:16:35.415927 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.415873 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 11:16:35.415927 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.415888 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4fnqs\"" Apr 17 11:16:35.416101 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.415937 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 11:16:35.416101 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.415797 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 11:16:35.416101 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.415893 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 11:16:35.458941 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.458913 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:34 +0000 UTC" deadline="2027-11-25 01:37:20.23394645 +0000 UTC" Apr 17 11:16:35.458941 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.458941 2577 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14078h20m44.775009157s" Apr 17 11:16:35.502103 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.502073 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:16:35.513838 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.513801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cd45e2c2-be74-4898-855b-e51a00ea7a92-tmp-dir\") pod \"node-resolver-hq6cq\" (UID: \"cd45e2c2-be74-4898-855b-e51a00ea7a92\") " pod="openshift-dns/node-resolver-hq6cq" Apr 17 11:16:35.513838 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.513837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-run-multus-certs\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.514029 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.513863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6311d15-5f59-4e6c-8732-269f06b40c16-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.514029 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.513911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-run-systemd\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.514029 
ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.513953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-run-openvswitch\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.514029 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.513950 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-run-multus-certs\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.514029 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.513973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ad8d75d-d9b5-45d3-ada0-68b4d648c30f-host\") pod \"node-ca-mbc4v\" (UID: \"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f\") " pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.514029 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.513998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-run-netns\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.514029 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-socket-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 
11:16:35.514379 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-run-netns\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.514379 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6311d15-5f59-4e6c-8732-269f06b40c16-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.514379 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-socket-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.514379 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1b74d479-a57e-4b33-8dc6-cd4321d01595-konnectivity-ca\") pod \"konnectivity-agent-46xz6\" (UID: \"1b74d479-a57e-4b33-8dc6-cd4321d01595\") " pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:16:35.514379 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cd45e2c2-be74-4898-855b-e51a00ea7a92-tmp-dir\") pod \"node-resolver-hq6cq\" (UID: \"cd45e2c2-be74-4898-855b-e51a00ea7a92\") " pod="openshift-dns/node-resolver-hq6cq" 
Apr 17 11:16:35.514597 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6311d15-5f59-4e6c-8732-269f06b40c16-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.514641 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6311d15-5f59-4e6c-8732-269f06b40c16-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.514715 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-tuned\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.514787 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ad8d75d-d9b5-45d3-ada0-68b4d648c30f-serviceca\") pod \"node-ca-mbc4v\" (UID: \"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f\") " pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.514840 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-run-ovn\") pod \"ovnkube-node-65xlv\" (UID: 
\"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.514840 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.514938 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvsfw\" (UniqueName: \"kubernetes.io/projected/e6311d15-5f59-4e6c-8732-269f06b40c16-kube-api-access-nvsfw\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.514938 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.514938 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-run\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.514938 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-host\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.515134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/207babed-420b-4305-9046-6bc8fb348f3f-iptables-alerter-script\") pod \"iptables-alerter-qdhpl\" (UID: \"207babed-420b-4305-9046-6bc8fb348f3f\") " pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 11:16:35.515134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr4qw\" (UniqueName: \"kubernetes.io/projected/207babed-420b-4305-9046-6bc8fb348f3f-kube-api-access-jr4qw\") pod \"iptables-alerter-qdhpl\" (UID: \"207babed-420b-4305-9046-6bc8fb348f3f\") " pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 11:16:35.515134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.515134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515030 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-host\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.515134 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:16:35.515052 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-sysctl-d\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.515134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.514990 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:16:35.515134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cd45e2c2-be74-4898-855b-e51a00ea7a92-hosts-file\") pod \"node-resolver-hq6cq\" (UID: \"cd45e2c2-be74-4898-855b-e51a00ea7a92\") " pod="openshift-dns/node-resolver-hq6cq" Apr 17 11:16:35.515134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515023 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-run\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.515134 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-daemon-config\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-system-cni-dir\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1b74d479-a57e-4b33-8dc6-cd4321d01595-konnectivity-ca\") pod \"konnectivity-agent-46xz6\" (UID: \"1b74d479-a57e-4b33-8dc6-cd4321d01595\") " pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cd45e2c2-be74-4898-855b-e51a00ea7a92-hosts-file\") pod \"node-resolver-hq6cq\" (UID: \"cd45e2c2-be74-4898-855b-e51a00ea7a92\") " pod="openshift-dns/node-resolver-hq6cq" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-node-log\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-sysctl-d\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-var-lib-cni-multus\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-etc-kubernetes\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-os-release\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-var-lib-cni-multus\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-system-cni-dir\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6311d15-5f59-4e6c-8732-269f06b40c16-cni-binary-copy\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515317 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-etc-kubernetes\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-lib-modules\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-ovn-node-metrics-cert\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-ovnkube-script-lib\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515411 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-os-release\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.515518 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-kubernetes\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515461 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8pw\" (UniqueName: \"kubernetes.io/projected/0caad504-c16e-477e-b9a9-80928417640e-kube-api-access-dr8pw\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-var-lib-openvswitch\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-lib-modules\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" 
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-kubernetes\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-cnibin\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-etc-selinux\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-cnibin\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-etc-selinux\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515670 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-daemon-config\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-cni-netd\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515696 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2mn2\" (UniqueName: \"kubernetes.io/projected/3576c24b-1c4c-4f39-b921-a4dd5a21236e-kube-api-access-w2mn2\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-cni-dir\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6311d15-5f59-4e6c-8732-269f06b40c16-cni-binary-copy\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-run-k8s-cni-cncf-io\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.516301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phjb8\" (UniqueName: \"kubernetes.io/projected/1ad8d75d-d9b5-45d3-ada0-68b4d648c30f-kube-api-access-phjb8\") pod \"node-ca-mbc4v\" (UID: \"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f\") " pod="openshift-image-registry/node-ca-mbc4v"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515816 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-cni-dir\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/207babed-420b-4305-9046-6bc8fb348f3f-host-slash\") pod \"iptables-alerter-qdhpl\" (UID: \"207babed-420b-4305-9046-6bc8fb348f3f\") " pod="openshift-network-operator/iptables-alerter-qdhpl"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-run-k8s-cni-cncf-io\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-socket-dir-parent\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-conf-dir\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-socket-dir-parent\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghfp\" (UniqueName: \"kubernetes.io/projected/002766c9-b94d-4afa-a980-2f7abc5b32d2-kube-api-access-9ghfp\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-multus-conf-dir\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-sysctl-conf\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.515981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-cni-bin\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-ovnkube-config\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c85040c-9a42-47fe-bdd4-7a0d5418502a-tmp\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-hostroot\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516129 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-sysctl-conf\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-device-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-device-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-slash\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.517060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-hostroot\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-cnibin\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1b74d479-a57e-4b33-8dc6-cd4321d01595-agent-certs\") pod \"konnectivity-agent-46xz6\" (UID: \"1b74d479-a57e-4b33-8dc6-cd4321d01595\") " pod="kube-system/konnectivity-agent-46xz6"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-registration-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516326 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-sys-fs\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6311d15-5f59-4e6c-8732-269f06b40c16-cnibin\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516379 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-registration-dir\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3576c24b-1c4c-4f39-b921-a4dd5a21236e-sys-fs\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-modprobe-d\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-var-lib-kubelet\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-etc-openvswitch\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-modprobe-d\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-log-socket\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zplcx\" (UniqueName: \"kubernetes.io/projected/cd45e2c2-be74-4898-855b-e51a00ea7a92-kube-api-access-zplcx\") pod \"node-resolver-hq6cq\" (UID: \"cd45e2c2-be74-4898-855b-e51a00ea7a92\") " pod="openshift-dns/node-resolver-hq6cq"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-var-lib-kubelet\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.517662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-system-cni-dir\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-system-cni-dir\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/002766c9-b94d-4afa-a980-2f7abc5b32d2-cni-binary-copy\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516836 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-sysconfig\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-sys\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516903 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2v6d6\" (UniqueName: \"kubernetes.io/projected/2c85040c-9a42-47fe-bdd4-7a0d5418502a-kube-api-access-2v6d6\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-sysconfig\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-sys\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.516928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-systemd-units\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/002766c9-b94d-4afa-a980-2f7abc5b32d2-cni-binary-copy\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-var-lib-cni-bin\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-var-lib-cni-bin\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-run-ovn-kubernetes\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-env-overrides\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-os-release\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517932 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-var-lib-kubelet\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-systemd\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.517982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-kubelet\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.518575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.518008 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-run-netns\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.519576 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.518013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-os-release\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.519576 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.518045 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-682vs\" (UniqueName: \"kubernetes.io/projected/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-kube-api-access-682vs\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.519576 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.518068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-systemd\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.519576 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.518152 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/002766c9-b94d-4afa-a980-2f7abc5b32d2-host-var-lib-kubelet\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.519576 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.518489 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2c85040c-9a42-47fe-bdd4-7a0d5418502a-etc-tuned\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.519576 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.518560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c85040c-9a42-47fe-bdd4-7a0d5418502a-tmp\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.519576 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.518934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1b74d479-a57e-4b33-8dc6-cd4321d01595-agent-certs\") pod \"konnectivity-agent-46xz6\" (UID: \"1b74d479-a57e-4b33-8dc6-cd4321d01595\") " pod="kube-system/konnectivity-agent-46xz6"
Apr 17 11:16:35.527187 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:35.527171 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:35.527187 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:35.527192 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:35.527325 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:35.527201 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2z28j for pod openshift-network-diagnostics/network-check-target-9975j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:35.527325 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:35.527276 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j podName:08f32961-6393-4bcc-a8bf-c27e9df01e0e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:36.027244255 +0000 UTC m=+3.075414145 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2z28j" (UniqueName: "kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j") pod "network-check-target-9975j" (UID: "08f32961-6393-4bcc-a8bf-c27e9df01e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:35.532481 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.530400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v6d6\" (UniqueName: \"kubernetes.io/projected/2c85040c-9a42-47fe-bdd4-7a0d5418502a-kube-api-access-2v6d6\") pod \"tuned-w6wn4\" (UID: \"2c85040c-9a42-47fe-bdd4-7a0d5418502a\") " pod="openshift-cluster-node-tuning-operator/tuned-w6wn4"
Apr 17 11:16:35.532481 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.530419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zplcx\" (UniqueName: \"kubernetes.io/projected/cd45e2c2-be74-4898-855b-e51a00ea7a92-kube-api-access-zplcx\") pod \"node-resolver-hq6cq\" (UID: \"cd45e2c2-be74-4898-855b-e51a00ea7a92\") " pod="openshift-dns/node-resolver-hq6cq"
Apr 17 11:16:35.532481 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.530437 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2mn2\" (UniqueName: \"kubernetes.io/projected/3576c24b-1c4c-4f39-b921-a4dd5a21236e-kube-api-access-w2mn2\") pod \"aws-ebs-csi-driver-node-4mbt4\" (UID: \"3576c24b-1c4c-4f39-b921-a4dd5a21236e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4"
Apr 17 11:16:35.532481 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.530855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvsfw\" (UniqueName: \"kubernetes.io/projected/e6311d15-5f59-4e6c-8732-269f06b40c16-kube-api-access-nvsfw\") pod \"multus-additional-cni-plugins-w2d4x\" (UID: \"e6311d15-5f59-4e6c-8732-269f06b40c16\") " pod="openshift-multus/multus-additional-cni-plugins-w2d4x"
Apr 17 11:16:35.533171 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.533151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghfp\" (UniqueName: \"kubernetes.io/projected/002766c9-b94d-4afa-a980-2f7abc5b32d2-kube-api-access-9ghfp\") pod \"multus-k2xxs\" (UID: \"002766c9-b94d-4afa-a980-2f7abc5b32d2\") " pod="openshift-multus/multus-k2xxs"
Apr 17 11:16:35.618960 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.618924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:35.619149 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.618972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-etc-openvswitch\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619149 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.618994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-log-socket\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619149 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-systemd-units\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619149 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:35.619040 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:35.619149 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-etc-openvswitch\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619149 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-run-ovn-kubernetes\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619149 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619122 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-log-socket\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619149 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-run-ovn-kubernetes\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619149 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:35.619135 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs podName:0caad504-c16e-477e-b9a9-80928417640e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:36.119100419 +0000 UTC m=+3.167270305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs") pod "network-metrics-daemon-tl874" (UID: "0caad504-c16e-477e-b9a9-80928417640e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-env-overrides\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-kubelet\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-systemd-units\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv"
Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619257 2577 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-run-netns\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-682vs\" (UniqueName: \"kubernetes.io/projected/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-kube-api-access-682vs\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-run-systemd\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619327 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-run-netns\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-run-openvswitch\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619369 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ad8d75d-d9b5-45d3-ada0-68b4d648c30f-host\") pod \"node-ca-mbc4v\" (UID: \"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f\") " pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619369 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-run-systemd\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-run-openvswitch\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ad8d75d-d9b5-45d3-ada0-68b4d648c30f-serviceca\") pod \"node-ca-mbc4v\" (UID: \"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f\") " pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-run-ovn\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/1ad8d75d-d9b5-45d3-ada0-68b4d648c30f-host\") pod \"node-ca-mbc4v\" (UID: \"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f\") " pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/207babed-420b-4305-9046-6bc8fb348f3f-iptables-alerter-script\") pod \"iptables-alerter-qdhpl\" (UID: \"207babed-420b-4305-9046-6bc8fb348f3f\") " pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr4qw\" (UniqueName: \"kubernetes.io/projected/207babed-420b-4305-9046-6bc8fb348f3f-kube-api-access-jr4qw\") pod \"iptables-alerter-qdhpl\" (UID: \"207babed-420b-4305-9046-6bc8fb348f3f\") " pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-run-ovn\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.619532 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-kubelet\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-node-log\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-ovn-node-metrics-cert\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-ovnkube-script-lib\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8pw\" (UniqueName: \"kubernetes.io/projected/0caad504-c16e-477e-b9a9-80928417640e-kube-api-access-dr8pw\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:16:35.620408 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:16:35.619667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-node-log\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-env-overrides\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-var-lib-openvswitch\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ad8d75d-d9b5-45d3-ada0-68b4d648c30f-serviceca\") pod \"node-ca-mbc4v\" (UID: \"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f\") " pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.620408 
ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-cni-netd\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-var-lib-openvswitch\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phjb8\" (UniqueName: \"kubernetes.io/projected/1ad8d75d-d9b5-45d3-ada0-68b4d648c30f-kube-api-access-phjb8\") pod \"node-ca-mbc4v\" (UID: \"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f\") " pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-cni-netd\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/207babed-420b-4305-9046-6bc8fb348f3f-host-slash\") pod \"iptables-alerter-qdhpl\" (UID: \"207babed-420b-4305-9046-6bc8fb348f3f\") " pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 
11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/207babed-420b-4305-9046-6bc8fb348f3f-host-slash\") pod \"iptables-alerter-qdhpl\" (UID: \"207babed-420b-4305-9046-6bc8fb348f3f\") " pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 11:16:35.620408 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-cni-bin\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.621160 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-ovnkube-config\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.621160 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.619976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-slash\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.621160 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.620004 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-cni-bin\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 
11:16:35.621160 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.620010 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/207babed-420b-4305-9046-6bc8fb348f3f-iptables-alerter-script\") pod \"iptables-alerter-qdhpl\" (UID: \"207babed-420b-4305-9046-6bc8fb348f3f\") " pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 11:16:35.621160 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.620055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-host-slash\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.621160 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.620201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-ovnkube-script-lib\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.621160 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.620428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-ovnkube-config\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.622123 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.622089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-ovn-node-metrics-cert\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.635546 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.635517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-682vs\" (UniqueName: \"kubernetes.io/projected/e5e299b9-9fdc-4122-ab7a-5d4a2753c88e-kube-api-access-682vs\") pod \"ovnkube-node-65xlv\" (UID: \"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e\") " pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.635678 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.635621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr4qw\" (UniqueName: \"kubernetes.io/projected/207babed-420b-4305-9046-6bc8fb348f3f-kube-api-access-jr4qw\") pod \"iptables-alerter-qdhpl\" (UID: \"207babed-420b-4305-9046-6bc8fb348f3f\") " pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 11:16:35.636856 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.636830 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phjb8\" (UniqueName: \"kubernetes.io/projected/1ad8d75d-d9b5-45d3-ada0-68b4d648c30f-kube-api-access-phjb8\") pod \"node-ca-mbc4v\" (UID: \"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f\") " pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.637231 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.637207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8pw\" (UniqueName: \"kubernetes.io/projected/0caad504-c16e-477e-b9a9-80928417640e-kube-api-access-dr8pw\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:16:35.700035 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.700007 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-k2xxs" Apr 17 11:16:35.707953 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.707931 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" Apr 17 11:16:35.721617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.721598 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hq6cq" Apr 17 11:16:35.726143 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.726123 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mbc4v" Apr 17 11:16:35.731757 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.731732 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" Apr 17 11:16:35.738653 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.738633 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:16:35.744175 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.744155 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" Apr 17 11:16:35.750677 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.750658 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qdhpl" Apr 17 11:16:35.752264 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.752248 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:35.783844 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:35.783791 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:36.123524 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.123444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:16:36.123524 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.123490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:16:36.123743 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:36.123626 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:36.123743 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:36.123630 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:36.123743 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:36.123658 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:36.123743 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:36.123673 2577 projected.go:194] Error preparing data for 
projected volume kube-api-access-2z28j for pod openshift-network-diagnostics/network-check-target-9975j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:36.123743 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:36.123692 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs podName:0caad504-c16e-477e-b9a9-80928417640e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:37.123674323 +0000 UTC m=+4.171844189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs") pod "network-metrics-daemon-tl874" (UID: "0caad504-c16e-477e-b9a9-80928417640e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:36.123743 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:36.123735 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j podName:08f32961-6393-4bcc-a8bf-c27e9df01e0e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:37.123723837 +0000 UTC m=+4.171893704 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2z28j" (UniqueName: "kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j") pod "network-check-target-9975j" (UID: "08f32961-6393-4bcc-a8bf-c27e9df01e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:36.143805 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:36.143778 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod207babed_420b_4305_9046_6bc8fb348f3f.slice/crio-0cbceab9857e51a44b805e39598db1e30301e91b41e3c5cb106f7f4e9607e673 WatchSource:0}: Error finding container 0cbceab9857e51a44b805e39598db1e30301e91b41e3c5cb106f7f4e9607e673: Status 404 returned error can't find the container with id 0cbceab9857e51a44b805e39598db1e30301e91b41e3c5cb106f7f4e9607e673 Apr 17 11:16:36.145950 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:36.145918 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b74d479_a57e_4b33_8dc6_cd4321d01595.slice/crio-d7e38717f37a054bb25b3df6a16575c1330c716ed289c19d01d9273b6a423d2e WatchSource:0}: Error finding container d7e38717f37a054bb25b3df6a16575c1330c716ed289c19d01d9273b6a423d2e: Status 404 returned error can't find the container with id d7e38717f37a054bb25b3df6a16575c1330c716ed289c19d01d9273b6a423d2e Apr 17 11:16:36.147097 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:36.147069 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6311d15_5f59_4e6c_8732_269f06b40c16.slice/crio-d2f454e8ca96ead40494aac46cd8bec4b5c0b3675b62f0fb1759b4477c50a94a WatchSource:0}: Error finding container d2f454e8ca96ead40494aac46cd8bec4b5c0b3675b62f0fb1759b4477c50a94a: Status 404 returned error can't find the 
container with id d2f454e8ca96ead40494aac46cd8bec4b5c0b3675b62f0fb1759b4477c50a94a
Apr 17 11:16:36.150101 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:36.150080 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod002766c9_b94d_4afa_a980_2f7abc5b32d2.slice/crio-beaabc8bb52394310380ddfb8beb68d137bc55341a0b504ac1120c631970b6f1 WatchSource:0}: Error finding container beaabc8bb52394310380ddfb8beb68d137bc55341a0b504ac1120c631970b6f1: Status 404 returned error can't find the container with id beaabc8bb52394310380ddfb8beb68d137bc55341a0b504ac1120c631970b6f1
Apr 17 11:16:36.150930 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:36.150905 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e299b9_9fdc_4122_ab7a_5d4a2753c88e.slice/crio-7a26c9eb41eec9b74dc04e5fc8158f61ff0911794c00d8e6c0fe703bc53b56fe WatchSource:0}: Error finding container 7a26c9eb41eec9b74dc04e5fc8158f61ff0911794c00d8e6c0fe703bc53b56fe: Status 404 returned error can't find the container with id 7a26c9eb41eec9b74dc04e5fc8158f61ff0911794c00d8e6c0fe703bc53b56fe
Apr 17 11:16:36.151780 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:36.151700 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd45e2c2_be74_4898_855b_e51a00ea7a92.slice/crio-cb30ced021fec692ac86d89866e89613799b1ceb0985e082a34d328817b65fb2 WatchSource:0}: Error finding container cb30ced021fec692ac86d89866e89613799b1ceb0985e082a34d328817b65fb2: Status 404 returned error can't find the container with id cb30ced021fec692ac86d89866e89613799b1ceb0985e082a34d328817b65fb2
Apr 17 11:16:36.152327 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:16:36.152294 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad8d75d_d9b5_45d3_ada0_68b4d648c30f.slice/crio-b27a6f53d5752b9ea5a4230db59aceaf419221ec235f394cd241ff5a4b43d3aa WatchSource:0}: Error finding container b27a6f53d5752b9ea5a4230db59aceaf419221ec235f394cd241ff5a4b43d3aa: Status 404 returned error can't find the container with id b27a6f53d5752b9ea5a4230db59aceaf419221ec235f394cd241ff5a4b43d3aa
Apr 17 11:16:36.459906 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.459678 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:34 +0000 UTC" deadline="2027-11-21 01:17:20.039239861 +0000 UTC"
Apr 17 11:16:36.459906 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.459865 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13982h0m43.579379062s"
Apr 17 11:16:36.547253 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.547182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" event={"ID":"3576c24b-1c4c-4f39-b921-a4dd5a21236e","Type":"ContainerStarted","Data":"14d296010d834bf47f961caa34af94a117c63a838cd12fea3a545cac4db6e9ec"}
Apr 17 11:16:36.549382 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.549346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" event={"ID":"2c85040c-9a42-47fe-bdd4-7a0d5418502a","Type":"ContainerStarted","Data":"ab08876b238ebaad96667188240fa81c8dc7b25b59f2a64f191b2ac7b1783326"}
Apr 17 11:16:36.556387 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.553994 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hq6cq" event={"ID":"cd45e2c2-be74-4898-855b-e51a00ea7a92","Type":"ContainerStarted","Data":"cb30ced021fec692ac86d89866e89613799b1ceb0985e082a34d328817b65fb2"}
Apr 17 11:16:36.563415 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.563351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xxs" event={"ID":"002766c9-b94d-4afa-a980-2f7abc5b32d2","Type":"ContainerStarted","Data":"beaabc8bb52394310380ddfb8beb68d137bc55341a0b504ac1120c631970b6f1"}
Apr 17 11:16:36.567443 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.567389 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qdhpl" event={"ID":"207babed-420b-4305-9046-6bc8fb348f3f","Type":"ContainerStarted","Data":"0cbceab9857e51a44b805e39598db1e30301e91b41e3c5cb106f7f4e9607e673"}
Apr 17 11:16:36.569910 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.569862 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mbc4v" event={"ID":"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f","Type":"ContainerStarted","Data":"b27a6f53d5752b9ea5a4230db59aceaf419221ec235f394cd241ff5a4b43d3aa"}
Apr 17 11:16:36.571248 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.571189 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerStarted","Data":"7a26c9eb41eec9b74dc04e5fc8158f61ff0911794c00d8e6c0fe703bc53b56fe"}
Apr 17 11:16:36.575166 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.575088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" event={"ID":"e6311d15-5f59-4e6c-8732-269f06b40c16","Type":"ContainerStarted","Data":"d2f454e8ca96ead40494aac46cd8bec4b5c0b3675b62f0fb1759b4477c50a94a"}
Apr 17 11:16:36.578732 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.578650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-46xz6" event={"ID":"1b74d479-a57e-4b33-8dc6-cd4321d01595","Type":"ContainerStarted","Data":"d7e38717f37a054bb25b3df6a16575c1330c716ed289c19d01d9273b6a423d2e"}
Apr 17 11:16:36.590588 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:36.589898 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal" event={"ID":"c775281676decdc6d9802c88c4684de2","Type":"ContainerStarted","Data":"fbc1223eacb5368490a7f0355d0b1813961c16d32a1c4149074c9de89c4fc5ec"}
Apr 17 11:16:37.131233 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:37.130873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:37.131233 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:37.130924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:37.131233 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:37.131064 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:37.131233 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:37.131089 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:37.131233 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:37.131104 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:37.131233 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:37.131152 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2z28j for pod openshift-network-diagnostics/network-check-target-9975j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:37.131233 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:37.131139 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs podName:0caad504-c16e-477e-b9a9-80928417640e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:39.131106319 +0000 UTC m=+6.179276208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs") pod "network-metrics-daemon-tl874" (UID: "0caad504-c16e-477e-b9a9-80928417640e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:37.131233 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:37.131194 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j podName:08f32961-6393-4bcc-a8bf-c27e9df01e0e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:39.131183648 +0000 UTC m=+6.179353528 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2z28j" (UniqueName: "kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j") pod "network-check-target-9975j" (UID: "08f32961-6393-4bcc-a8bf-c27e9df01e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:37.537651 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:37.537183 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:37.537651 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:37.537305 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e"
Apr 17 11:16:37.538453 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:37.538281 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:37.538453 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:37.538403 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:16:37.603811 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:37.603773 2577 generic.go:358] "Generic (PLEG): container finished" podID="754c4e5142ebf952f02602a3888b764e" containerID="5119629da89b6245e810483b207b9134f5e52241bc2a74a11a84d5f69715b500" exitCode=0
Apr 17 11:16:37.604549 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:37.604298 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" event={"ID":"754c4e5142ebf952f02602a3888b764e","Type":"ContainerDied","Data":"5119629da89b6245e810483b207b9134f5e52241bc2a74a11a84d5f69715b500"}
Apr 17 11:16:37.617968 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:37.616983 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-247.ec2.internal" podStartSLOduration=3.616966418 podStartE2EDuration="3.616966418s" podCreationTimestamp="2026-04-17 11:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:36.604213074 +0000 UTC m=+3.652382964" watchObservedRunningTime="2026-04-17 11:16:37.616966418 +0000 UTC m=+4.665136307"
Apr 17 11:16:38.615447 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:38.615411 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" event={"ID":"754c4e5142ebf952f02602a3888b764e","Type":"ContainerStarted","Data":"47ff79b60eb0b275778425003452579d9c7535d4eab082eb5fd5d56c49060e80"}
Apr 17 11:16:39.145932 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:39.145892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:39.146159 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:39.145953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:39.146159 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:39.146134 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:39.146291 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:39.146199 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs podName:0caad504-c16e-477e-b9a9-80928417640e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:43.146180087 +0000 UTC m=+10.194349955 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs") pod "network-metrics-daemon-tl874" (UID: "0caad504-c16e-477e-b9a9-80928417640e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:39.146583 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:39.146463 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:39.146583 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:39.146492 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:39.146583 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:39.146506 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2z28j for pod openshift-network-diagnostics/network-check-target-9975j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:39.146583 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:39.146560 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j podName:08f32961-6393-4bcc-a8bf-c27e9df01e0e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:43.146543867 +0000 UTC m=+10.194713734 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2z28j" (UniqueName: "kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j") pod "network-check-target-9975j" (UID: "08f32961-6393-4bcc-a8bf-c27e9df01e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:39.537726 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:39.537652 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:39.537890 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:39.537791 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:16:39.537951 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:39.537910 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:39.538083 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:39.538036 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e"
Apr 17 11:16:41.535354 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:41.534853 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:41.535354 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:41.534973 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e"
Apr 17 11:16:41.535354 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:41.535144 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:41.535354 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:41.535267 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:16:43.176489 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:43.176451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:43.176942 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:43.176506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:43.176942 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:43.176631 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:43.176942 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:43.176690 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs podName:0caad504-c16e-477e-b9a9-80928417640e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:51.176671727 +0000 UTC m=+18.224841594 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs") pod "network-metrics-daemon-tl874" (UID: "0caad504-c16e-477e-b9a9-80928417640e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:43.177165 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:43.177093 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:43.177165 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:43.177126 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:43.177165 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:43.177139 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2z28j for pod openshift-network-diagnostics/network-check-target-9975j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:43.177327 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:43.177194 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j podName:08f32961-6393-4bcc-a8bf-c27e9df01e0e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:51.177177225 +0000 UTC m=+18.225347096 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2z28j" (UniqueName: "kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j") pod "network-check-target-9975j" (UID: "08f32961-6393-4bcc-a8bf-c27e9df01e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:43.536161 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:43.535646 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:43.536161 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:43.535663 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:43.536161 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:43.535767 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e"
Apr 17 11:16:43.536161 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:43.535818 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:16:45.534457 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:45.534379 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:45.534862 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:45.534525 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e"
Apr 17 11:16:45.534862 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:45.534577 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:45.534862 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:45.534709 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:16:47.535236 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:47.535202 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:47.535741 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:47.535316 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e"
Apr 17 11:16:47.535741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:47.535406 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:47.535741 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:47.535557 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:16:49.535371 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:49.535340 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:49.535792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:49.535350 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:49.535792 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:49.535463 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e"
Apr 17 11:16:49.535792 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:49.535567 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:16:51.237516 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:51.237474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:51.238078 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:51.237528 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:51.238078 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:51.237638 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:51.238078 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:51.237648 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:51.238078 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:51.237675 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:51.238078 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:51.237690 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2z28j for pod openshift-network-diagnostics/network-check-target-9975j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:51.238078 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:51.237711 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs podName:0caad504-c16e-477e-b9a9-80928417640e nodeName:}" failed. No retries permitted until 2026-04-17 11:17:07.237692179 +0000 UTC m=+34.285862052 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs") pod "network-metrics-daemon-tl874" (UID: "0caad504-c16e-477e-b9a9-80928417640e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:51.238078 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:51.237740 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j podName:08f32961-6393-4bcc-a8bf-c27e9df01e0e nodeName:}" failed. No retries permitted until 2026-04-17 11:17:07.237726988 +0000 UTC m=+34.285896868 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2z28j" (UniqueName: "kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j") pod "network-check-target-9975j" (UID: "08f32961-6393-4bcc-a8bf-c27e9df01e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:51.535036 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:51.534955 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:51.535209 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:51.535085 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e"
Apr 17 11:16:51.535209 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:51.535143 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:51.535323 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:51.535253 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:16:53.536060 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.535783 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j"
Apr 17 11:16:53.536662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.535886 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874"
Apr 17 11:16:53.536662 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:53.536139 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e"
Apr 17 11:16:53.536662 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:53.536232 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:16:53.640203 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.640152 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mbc4v" event={"ID":"1ad8d75d-d9b5-45d3-ada0-68b4d648c30f","Type":"ContainerStarted","Data":"ab40d051eac5773bfffa341fc4de1e22af1edbc295693a7a4e429894ee03ca44"}
Apr 17 11:16:53.642773 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.642749 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log"
Apr 17 11:16:53.643068 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.643046 2577 generic.go:358] "Generic (PLEG): container finished" podID="e5e299b9-9fdc-4122-ab7a-5d4a2753c88e" containerID="7199b04b3a5f81b252a66e2a349b20cc4c6c86ce74311b9d0dd0371c5824a800" exitCode=1
Apr 17 11:16:53.643190 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.643131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerStarted","Data":"8fdd3b667b0f6f9f75ae17021219c18325e06797f3608e288789f96734924c1d"}
Apr 17 11:16:53.643190 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.643160 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerStarted","Data":"4c3e0a9f3d09445ba313c9123b435bc90c2030a28bf69adb38c27498d0fbdd22"}
Apr 17 11:16:53.643190 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.643173 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerStarted","Data":"9ad5357e45b876683d2c30355864d2a9226802b591d0c67b93d41665e314b2d9"}
Apr 17 11:16:53.643190 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.643185 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerDied","Data":"7199b04b3a5f81b252a66e2a349b20cc4c6c86ce74311b9d0dd0371c5824a800"}
Apr 17 11:16:53.643364 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.643199 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerStarted","Data":"3408916bda2e84729b6e4920b0fef2ec05304cdd9e4276868ce4d28f341196e3"}
Apr 17 11:16:53.644398 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.644375 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6311d15-5f59-4e6c-8732-269f06b40c16" containerID="979f073684bb0bb8bda33871992af683bccbecdea24e7a5622a21bdd8e72abeb" exitCode=0
Apr 17 11:16:53.644487 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.644402 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" event={"ID":"e6311d15-5f59-4e6c-8732-269f06b40c16","Type":"ContainerDied","Data":"979f073684bb0bb8bda33871992af683bccbecdea24e7a5622a21bdd8e72abeb"}
Apr 17 11:16:53.646204 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.646162 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-46xz6" event={"ID":"1b74d479-a57e-4b33-8dc6-cd4321d01595","Type":"ContainerStarted","Data":"219bc6670e7a2a72248b3502b110dc8e781071bdb7952e587797cfa5b2cbd91e"}
Apr 17 11:16:53.647497 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.647466 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" event={"ID":"3576c24b-1c4c-4f39-b921-a4dd5a21236e","Type":"ContainerStarted","Data":"ea478c51a24e58b1b9d39d39990618059497e98f30312a5cc7f7e144c014f3ff"}
Apr 17 11:16:53.648776 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.648753 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" event={"ID":"2c85040c-9a42-47fe-bdd4-7a0d5418502a","Type":"ContainerStarted","Data":"9d9bdbace5219a6c76059abee38df241552ae467d3d7e37f388fa5b61629a646"}
Apr 17 11:16:53.650135 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.650099 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hq6cq" event={"ID":"cd45e2c2-be74-4898-855b-e51a00ea7a92","Type":"ContainerStarted","Data":"37f92eccd6093b261a25ac92b0e5b716527fd786117025c24d8e5e9cf8024c3b"}
Apr 17 11:16:53.651596 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.651518 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xxs" event={"ID":"002766c9-b94d-4afa-a980-2f7abc5b32d2","Type":"ContainerStarted","Data":"787d7e9275edbbbdcf9f4fa3726e3e617edfbf1d162cfd1ac2b661fb2a577d68"}
Apr 17 11:16:53.654792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.654757 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mbc4v" podStartSLOduration=3.861997043 podStartE2EDuration="20.654746292s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:16:36.157474803 +0000 UTC m=+3.205644673" lastFinishedPulling="2026-04-17 11:16:52.950224043 +0000 UTC m=+19.998393922" observedRunningTime="2026-04-17 11:16:53.654612234 +0000 UTC m=+20.702782122" watchObservedRunningTime="2026-04-17 11:16:53.654746292 +0000 UTC m=+20.702916179"
Apr 17 11:16:53.655314 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.655284 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-247.ec2.internal" podStartSLOduration=19.655277448 podStartE2EDuration="19.655277448s" podCreationTimestamp="2026-04-17 11:16:34 +0000 UTC" firstStartedPulling="0001-01-01
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:38.632696581 +0000 UTC m=+5.680866471" watchObservedRunningTime="2026-04-17 11:16:53.655277448 +0000 UTC m=+20.703447337" Apr 17 11:16:53.676744 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.676711 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-k2xxs" podStartSLOduration=3.810114896 podStartE2EDuration="20.67670193s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:16:36.151980632 +0000 UTC m=+3.200150499" lastFinishedPulling="2026-04-17 11:16:53.018567662 +0000 UTC m=+20.066737533" observedRunningTime="2026-04-17 11:16:53.676328198 +0000 UTC m=+20.724498282" watchObservedRunningTime="2026-04-17 11:16:53.67670193 +0000 UTC m=+20.724871817" Apr 17 11:16:53.696005 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.695954 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hq6cq" podStartSLOduration=3.89992525 podStartE2EDuration="20.695938835s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:16:36.15449348 +0000 UTC m=+3.202663349" lastFinishedPulling="2026-04-17 11:16:52.950507057 +0000 UTC m=+19.998676934" observedRunningTime="2026-04-17 11:16:53.695149311 +0000 UTC m=+20.743319201" watchObservedRunningTime="2026-04-17 11:16:53.695938835 +0000 UTC m=+20.744108724" Apr 17 11:16:53.735063 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.735004 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-46xz6" podStartSLOduration=11.763674219 podStartE2EDuration="20.734987437s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:16:36.148342914 +0000 UTC m=+3.196512780" lastFinishedPulling="2026-04-17 11:16:45.119656114 +0000 UTC m=+12.167825998" observedRunningTime="2026-04-17 
11:16:53.734940683 +0000 UTC m=+20.783110570" watchObservedRunningTime="2026-04-17 11:16:53.734987437 +0000 UTC m=+20.783157339" Apr 17 11:16:53.756763 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:53.756721 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-w6wn4" podStartSLOduration=3.961636705 podStartE2EDuration="20.756708669s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:16:36.157849416 +0000 UTC m=+3.206019286" lastFinishedPulling="2026-04-17 11:16:52.952921379 +0000 UTC m=+20.001091250" observedRunningTime="2026-04-17 11:16:53.756273691 +0000 UTC m=+20.804443578" watchObservedRunningTime="2026-04-17 11:16:53.756708669 +0000 UTC m=+20.804878556" Apr 17 11:16:54.545085 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:54.545045 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 11:16:54.655608 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:54.655519 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" event={"ID":"3576c24b-1c4c-4f39-b921-a4dd5a21236e","Type":"ContainerStarted","Data":"30f7dad17daa27afd18f71d83b6be2a7662cbdf67a7ba2e79dd7627b9f7bf585"} Apr 17 11:16:54.657005 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:54.656954 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qdhpl" event={"ID":"207babed-420b-4305-9046-6bc8fb348f3f","Type":"ContainerStarted","Data":"b18fd2aaac0275ca93854605f169aab80587159b6b286b3ce29c6d000894fe76"} Apr 17 11:16:54.659652 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:54.659630 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log" Apr 17 11:16:54.660024 
ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:54.659946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerStarted","Data":"0b97d0f1397f8fcaef2b519f4e5d383b9233adb2945d92ba3467454e7be95f66"} Apr 17 11:16:54.673349 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:54.673272 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qdhpl" podStartSLOduration=4.984209985 podStartE2EDuration="21.673256187s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:16:36.145706231 +0000 UTC m=+3.193876096" lastFinishedPulling="2026-04-17 11:16:52.834752431 +0000 UTC m=+19.882922298" observedRunningTime="2026-04-17 11:16:54.67258451 +0000 UTC m=+21.720754440" watchObservedRunningTime="2026-04-17 11:16:54.673256187 +0000 UTC m=+21.721426077" Apr 17 11:16:55.474075 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:55.473988 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:54.545065465Z","UUID":"52188adc-6fb0-41b4-960f-e84d5cf996bc","Handler":null,"Name":"","Endpoint":""} Apr 17 11:16:55.474291 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:55.474225 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:16:55.474839 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:55.474821 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:16:55.475899 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:55.475884 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 
11:16:55.475983 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:55.475905 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 11:16:55.534665 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:55.534581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:16:55.534802 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:55.534703 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e" Apr 17 11:16:55.534802 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:55.534762 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:16:55.534898 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:55.534885 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e" Apr 17 11:16:56.667171 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:56.666965 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log" Apr 17 11:16:56.667635 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:56.667541 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerStarted","Data":"f0c131ab50e7a45deafafadd22039f6eb376d65bbfd22105fdf65fe70419a255"} Apr 17 11:16:56.669406 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:56.669379 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" event={"ID":"3576c24b-1c4c-4f39-b921-a4dd5a21236e","Type":"ContainerStarted","Data":"83d92189d4bc00af995d0638491260d7407285e060e1245af1acfd711bb7bd76"} Apr 17 11:16:56.669520 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:56.669419 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:16:56.686429 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:56.686386 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mbt4" podStartSLOduration=4.162027545 podStartE2EDuration="23.686372649s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:16:36.159203076 +0000 UTC m=+3.207372946" lastFinishedPulling="2026-04-17 11:16:55.683548171 +0000 UTC m=+22.731718050" observedRunningTime="2026-04-17 11:16:56.686253018 +0000 UTC m=+23.734422899" watchObservedRunningTime="2026-04-17 11:16:56.686372649 +0000 UTC m=+23.734542539" Apr 17 11:16:57.538436 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:57.538411 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:16:57.538630 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:57.538415 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:16:57.538630 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:57.538550 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e" Apr 17 11:16:57.538630 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:57.538617 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e" Apr 17 11:16:58.676076 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:58.675819 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log" Apr 17 11:16:58.676787 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:58.676287 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerStarted","Data":"9e5db8fe8a3266bacd1fafe3d65cefc33b347513d3d1d3b0e14a2637edb6e9af"} Apr 17 11:16:58.676787 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:58.676619 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:58.676787 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:58.676713 2577 scope.go:117] "RemoveContainer" containerID="7199b04b3a5f81b252a66e2a349b20cc4c6c86ce74311b9d0dd0371c5824a800" Apr 17 11:16:58.678181 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:58.678156 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6311d15-5f59-4e6c-8732-269f06b40c16" containerID="9df77433552cd130d5fd426f8ae7a5635a8bf3cadd28add5e52aae80409dfbd8" exitCode=0 Apr 17 11:16:58.678300 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:58.678186 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" event={"ID":"e6311d15-5f59-4e6c-8732-269f06b40c16","Type":"ContainerDied","Data":"9df77433552cd130d5fd426f8ae7a5635a8bf3cadd28add5e52aae80409dfbd8"} Apr 17 11:16:58.691835 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:58.691818 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:59.534922 ip-10-0-142-247 kubenswrapper[2577]: I0417 
11:16:59.534893 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:16:59.535088 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:59.534985 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e" Apr 17 11:16:59.535088 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:59.535069 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:16:59.535218 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:16:59.535198 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e" Apr 17 11:16:59.683219 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:59.683198 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log" Apr 17 11:16:59.683617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:59.683521 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" event={"ID":"e5e299b9-9fdc-4122-ab7a-5d4a2753c88e","Type":"ContainerStarted","Data":"d6ac761c61720336db09c58872282b9f63b5e82b0f010e6122c03a8fdf08ea46"} Apr 17 11:16:59.683656 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:59.683639 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:16:59.683881 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:59.683864 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:59.699325 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:59.699293 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:16:59.721144 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:16:59.721056 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" podStartSLOduration=9.837586342 podStartE2EDuration="26.721041956s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:16:36.153951603 +0000 UTC m=+3.202121475" lastFinishedPulling="2026-04-17 11:16:53.037407217 +0000 UTC m=+20.085577089" observedRunningTime="2026-04-17 11:16:59.720038201 +0000 UTC m=+26.768208079" watchObservedRunningTime="2026-04-17 11:16:59.721041956 +0000 UTC m=+26.769211846" Apr 17 11:17:00.135680 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:00.135618 2577 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9975j"] Apr 17 11:17:00.135809 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:00.135722 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:00.135809 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:00.135793 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e" Apr 17 11:17:00.138238 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:00.138206 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tl874"] Apr 17 11:17:00.138342 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:00.138297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:17:00.138385 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:00.138368 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e" Apr 17 11:17:00.687291 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:00.687258 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6311d15-5f59-4e6c-8732-269f06b40c16" containerID="a4856934ea01f444bb300c1272a82f03854a63ad4f4b701d44ba6f30b18078a7" exitCode=0 Apr 17 11:17:00.687704 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:00.687344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" event={"ID":"e6311d15-5f59-4e6c-8732-269f06b40c16","Type":"ContainerDied","Data":"a4856934ea01f444bb300c1272a82f03854a63ad4f4b701d44ba6f30b18078a7"} Apr 17 11:17:00.687704 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:00.687436 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:17:01.241500 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:01.241473 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:17:01.241680 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:01.241616 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:17:01.242195 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:01.242170 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-46xz6" Apr 17 11:17:01.534512 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:01.534300 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:01.534646 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:01.534300 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:17:01.534646 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:01.534557 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e" Apr 17 11:17:01.534646 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:01.534627 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e" Apr 17 11:17:01.689164 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:01.689139 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:17:02.695300 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:02.695266 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6311d15-5f59-4e6c-8732-269f06b40c16" containerID="ac354dd4d2dbd21768ac488ec3749a83de8ad56c2d6308aa99ef2d17523a6e97" exitCode=0 Apr 17 11:17:02.695671 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:02.695321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" event={"ID":"e6311d15-5f59-4e6c-8732-269f06b40c16","Type":"ContainerDied","Data":"ac354dd4d2dbd21768ac488ec3749a83de8ad56c2d6308aa99ef2d17523a6e97"} Apr 17 11:17:03.536239 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:03.536206 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:03.536239 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:03.536232 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:17:03.536462 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:03.536317 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e" Apr 17 11:17:03.536462 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:03.536427 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e" Apr 17 11:17:04.697925 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:04.697893 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:17:04.698666 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:04.698307 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:17:04.713842 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:04.713817 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65xlv" Apr 17 11:17:05.534522 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.534488 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:05.534698 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:05.534613 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9975j" podUID="08f32961-6393-4bcc-a8bf-c27e9df01e0e" Apr 17 11:17:05.534754 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.534699 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:17:05.534837 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:05.534816 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e" Apr 17 11:17:05.778915 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.778886 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-247.ec2.internal" event="NodeReady" Apr 17 11:17:05.779292 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.779019 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:17:05.837822 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.837793 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nd4tb"] Apr 17 11:17:05.862310 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.862289 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2n6g2"] Apr 17 11:17:05.862463 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.862419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:05.864500 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.864468 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 11:17:05.864664 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.864646 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 11:17:05.864742 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.864653 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5xzqf\"" Apr 17 11:17:05.864788 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.864738 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 11:17:05.881041 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.880974 2577 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nd4tb"] Apr 17 11:17:05.881155 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.881061 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2n6g2"] Apr 17 11:17:05.881155 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.881068 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:05.883063 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.883046 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:17:05.883159 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.883092 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:17:05.883212 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.883187 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v5zg9\"" Apr 17 11:17:05.953903 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.953868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:05.954078 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:05.953916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pdg5\" (UniqueName: \"kubernetes.io/projected/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-kube-api-access-9pdg5\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:06.055046 ip-10-0-142-247 kubenswrapper[2577]: I0417 
11:17:06.055014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-config-volume\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.055046 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.055052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-tmp-dir\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.055311 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.055069 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz5xp\" (UniqueName: \"kubernetes.io/projected/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-kube-api-access-hz5xp\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.055311 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.055199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:06.055311 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.055238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pdg5\" (UniqueName: \"kubernetes.io/projected/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-kube-api-access-9pdg5\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:06.055451 
ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:06.055400 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:06.055499 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.055444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.055499 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:06.055463 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert podName:dd572e4e-2d5a-47ec-8142-2e264dae6c8b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:06.555443665 +0000 UTC m=+33.603613535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert") pod "ingress-canary-nd4tb" (UID: "dd572e4e-2d5a-47ec-8142-2e264dae6c8b") : secret "canary-serving-cert" not found Apr 17 11:17:06.067788 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.067751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pdg5\" (UniqueName: \"kubernetes.io/projected/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-kube-api-access-9pdg5\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:06.156516 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.156432 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " 
pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.156516 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.156515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-config-volume\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.156739 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.156541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-tmp-dir\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.156739 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.156567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz5xp\" (UniqueName: \"kubernetes.io/projected/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-kube-api-access-hz5xp\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.156739 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:06.156604 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:06.156739 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:06.156688 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls podName:997fc9f1-c739-43ee-8f0f-bcc328a8b37f nodeName:}" failed. No retries permitted until 2026-04-17 11:17:06.656666392 +0000 UTC m=+33.704836262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls") pod "dns-default-2n6g2" (UID: "997fc9f1-c739-43ee-8f0f-bcc328a8b37f") : secret "dns-default-metrics-tls" not found Apr 17 11:17:06.156983 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.156963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-tmp-dir\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.157160 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.157142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-config-volume\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.166050 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.166024 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz5xp\" (UniqueName: \"kubernetes.io/projected/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-kube-api-access-hz5xp\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.559899 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.559818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:06.560036 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:06.559977 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 
11:17:06.560078 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:06.560036 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert podName:dd572e4e-2d5a-47ec-8142-2e264dae6c8b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:07.560021642 +0000 UTC m=+34.608191508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert") pod "ingress-canary-nd4tb" (UID: "dd572e4e-2d5a-47ec-8142-2e264dae6c8b") : secret "canary-serving-cert" not found Apr 17 11:17:06.660215 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:06.660183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:06.660399 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:06.660324 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:06.660399 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:06.660386 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls podName:997fc9f1-c739-43ee-8f0f-bcc328a8b37f nodeName:}" failed. No retries permitted until 2026-04-17 11:17:07.660371006 +0000 UTC m=+34.708540872 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls") pod "dns-default-2n6g2" (UID: "997fc9f1-c739-43ee-8f0f-bcc328a8b37f") : secret "dns-default-metrics-tls" not found Apr 17 11:17:07.263802 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.263760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:07.264449 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.263817 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:17:07.264449 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.263931 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:17:07.264449 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.264001 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs podName:0caad504-c16e-477e-b9a9-80928417640e nodeName:}" failed. No retries permitted until 2026-04-17 11:17:39.263981546 +0000 UTC m=+66.312151425 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs") pod "network-metrics-daemon-tl874" (UID: "0caad504-c16e-477e-b9a9-80928417640e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:17:07.264449 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.263933 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:17:07.264449 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.264043 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:17:07.264449 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.264057 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2z28j for pod openshift-network-diagnostics/network-check-target-9975j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:17:07.264449 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.264101 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j podName:08f32961-6393-4bcc-a8bf-c27e9df01e0e nodeName:}" failed. No retries permitted until 2026-04-17 11:17:39.264088216 +0000 UTC m=+66.312258086 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2z28j" (UniqueName: "kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j") pod "network-check-target-9975j" (UID: "08f32961-6393-4bcc-a8bf-c27e9df01e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:17:07.535495 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.535412 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:07.535495 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.535460 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:17:07.537922 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.537897 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:17:07.538548 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.538479 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:17:07.538661 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.538560 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ntppc\"" Apr 17 11:17:07.538661 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.538570 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:17:07.538762 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.538696 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gg44j\"" Apr 17 11:17:07.566239 ip-10-0-142-247 kubenswrapper[2577]: I0417 
11:17:07.566215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:07.566374 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.566358 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:07.566452 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.566440 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert podName:dd572e4e-2d5a-47ec-8142-2e264dae6c8b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:09.566418936 +0000 UTC m=+36.614588814 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert") pod "ingress-canary-nd4tb" (UID: "dd572e4e-2d5a-47ec-8142-2e264dae6c8b") : secret "canary-serving-cert" not found Apr 17 11:17:07.666945 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:07.666905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:07.667138 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.667100 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:07.667206 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:07.667184 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls podName:997fc9f1-c739-43ee-8f0f-bcc328a8b37f 
nodeName:}" failed. No retries permitted until 2026-04-17 11:17:09.667165306 +0000 UTC m=+36.715335173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls") pod "dns-default-2n6g2" (UID: "997fc9f1-c739-43ee-8f0f-bcc328a8b37f") : secret "dns-default-metrics-tls" not found Apr 17 11:17:08.710076 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:08.709890 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" event={"ID":"e6311d15-5f59-4e6c-8732-269f06b40c16","Type":"ContainerStarted","Data":"69537d0cbd4d1581769702cdbab353e2a8437b5ea60354d6307d49e06f5e4fa9"} Apr 17 11:17:09.007070 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.006996 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f"] Apr 17 11:17:09.018177 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.018154 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.020231 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.020206 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 11:17:09.020802 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.020783 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 11:17:09.021429 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.021414 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 11:17:09.021504 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.021417 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 11:17:09.024563 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.024535 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f"] Apr 17 11:17:09.058636 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.058615 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s"] Apr 17 11:17:09.062772 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.062758 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.065228 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.065202 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 11:17:09.065331 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.065270 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 11:17:09.065331 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.065293 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 11:17:09.065439 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.065297 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 11:17:09.070585 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.070567 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s"] Apr 17 11:17:09.176743 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.176714 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsbdc\" (UniqueName: \"kubernetes.io/projected/221aef81-7106-48e5-8081-4ba125428409-kube-api-access-vsbdc\") pod \"klusterlet-addon-workmgr-674fbc444c-djk9f\" (UID: \"221aef81-7106-48e5-8081-4ba125428409\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.176743 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.176745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.176946 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.176776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdw5\" (UniqueName: \"kubernetes.io/projected/5b6788a2-861c-4390-b69b-1c2415577459-kube-api-access-2mdw5\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.176946 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.176882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/221aef81-7106-48e5-8081-4ba125428409-tmp\") pod \"klusterlet-addon-workmgr-674fbc444c-djk9f\" (UID: \"221aef81-7106-48e5-8081-4ba125428409\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.176946 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.176911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5b6788a2-861c-4390-b69b-1c2415577459-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.176946 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.176935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-ca\") pod 
\"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.177164 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.176998 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-hub\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.177164 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.177034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/221aef81-7106-48e5-8081-4ba125428409-klusterlet-config\") pod \"klusterlet-addon-workmgr-674fbc444c-djk9f\" (UID: \"221aef81-7106-48e5-8081-4ba125428409\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.177164 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.177078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.278057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdw5\" (UniqueName: \"kubernetes.io/projected/5b6788a2-861c-4390-b69b-1c2415577459-kube-api-access-2mdw5\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: 
\"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.278208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/221aef81-7106-48e5-8081-4ba125428409-tmp\") pod \"klusterlet-addon-workmgr-674fbc444c-djk9f\" (UID: \"221aef81-7106-48e5-8081-4ba125428409\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.278252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5b6788a2-861c-4390-b69b-1c2415577459-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.278304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-ca\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.278340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-hub\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.278378 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/221aef81-7106-48e5-8081-4ba125428409-klusterlet-config\") pod \"klusterlet-addon-workmgr-674fbc444c-djk9f\" (UID: \"221aef81-7106-48e5-8081-4ba125428409\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.278426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.278480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsbdc\" (UniqueName: \"kubernetes.io/projected/221aef81-7106-48e5-8081-4ba125428409-kube-api-access-vsbdc\") pod \"klusterlet-addon-workmgr-674fbc444c-djk9f\" (UID: \"221aef81-7106-48e5-8081-4ba125428409\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.278517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.280921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.280359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/221aef81-7106-48e5-8081-4ba125428409-tmp\") pod \"klusterlet-addon-workmgr-674fbc444c-djk9f\" (UID: \"221aef81-7106-48e5-8081-4ba125428409\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.281492 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.281062 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5b6788a2-861c-4390-b69b-1c2415577459-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.282719 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.282692 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.283465 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.283445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-hub\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.283591 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.283498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-ca\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.283988 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.283956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5b6788a2-861c-4390-b69b-1c2415577459-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.284224 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.284208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/221aef81-7106-48e5-8081-4ba125428409-klusterlet-config\") pod \"klusterlet-addon-workmgr-674fbc444c-djk9f\" (UID: \"221aef81-7106-48e5-8081-4ba125428409\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.289266 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.289247 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdw5\" (UniqueName: \"kubernetes.io/projected/5b6788a2-861c-4390-b69b-1c2415577459-kube-api-access-2mdw5\") pod \"cluster-proxy-proxy-agent-54495bbd66-pxk6s\" (UID: \"5b6788a2-861c-4390-b69b-1c2415577459\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.291086 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.291065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsbdc\" (UniqueName: \"kubernetes.io/projected/221aef81-7106-48e5-8081-4ba125428409-kube-api-access-vsbdc\") pod \"klusterlet-addon-workmgr-674fbc444c-djk9f\" (UID: \"221aef81-7106-48e5-8081-4ba125428409\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.327084 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:17:09.327063 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:09.385434 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.385396 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:17:09.528971 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.528923 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f"] Apr 17 11:17:09.531749 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.531726 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s"] Apr 17 11:17:09.533515 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:17:09.533489 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221aef81_7106_48e5_8081_4ba125428409.slice/crio-7d56ca0b56cc08b5bcbd336fe8fe6a31208796976cc8eeb759b3ae07ac1aa8a7 WatchSource:0}: Error finding container 7d56ca0b56cc08b5bcbd336fe8fe6a31208796976cc8eeb759b3ae07ac1aa8a7: Status 404 returned error can't find the container with id 7d56ca0b56cc08b5bcbd336fe8fe6a31208796976cc8eeb759b3ae07ac1aa8a7 Apr 17 11:17:09.534547 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:17:09.534520 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b6788a2_861c_4390_b69b_1c2415577459.slice/crio-bdd1b7c039139a151dd6eaf03bd76c6b05f0f3c2383680545dcb68b579fa34ac WatchSource:0}: Error finding container bdd1b7c039139a151dd6eaf03bd76c6b05f0f3c2383680545dcb68b579fa34ac: Status 404 returned error can't find the container with id 
bdd1b7c039139a151dd6eaf03bd76c6b05f0f3c2383680545dcb68b579fa34ac Apr 17 11:17:09.580983 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.580959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:09.581087 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:09.581060 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:09.581160 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:09.581135 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert podName:dd572e4e-2d5a-47ec-8142-2e264dae6c8b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:13.581098218 +0000 UTC m=+40.629268085 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert") pod "ingress-canary-nd4tb" (UID: "dd572e4e-2d5a-47ec-8142-2e264dae6c8b") : secret "canary-serving-cert" not found Apr 17 11:17:09.681825 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.681788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:09.681965 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:09.681937 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:09.682015 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:09.682005 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls podName:997fc9f1-c739-43ee-8f0f-bcc328a8b37f nodeName:}" failed. No retries permitted until 2026-04-17 11:17:13.681985384 +0000 UTC m=+40.730155252 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls") pod "dns-default-2n6g2" (UID: "997fc9f1-c739-43ee-8f0f-bcc328a8b37f") : secret "dns-default-metrics-tls" not found Apr 17 11:17:09.713987 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.713961 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6311d15-5f59-4e6c-8732-269f06b40c16" containerID="69537d0cbd4d1581769702cdbab353e2a8437b5ea60354d6307d49e06f5e4fa9" exitCode=0 Apr 17 11:17:09.714371 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.714027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" event={"ID":"e6311d15-5f59-4e6c-8732-269f06b40c16","Type":"ContainerDied","Data":"69537d0cbd4d1581769702cdbab353e2a8437b5ea60354d6307d49e06f5e4fa9"} Apr 17 11:17:09.715133 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.715096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" event={"ID":"5b6788a2-861c-4390-b69b-1c2415577459","Type":"ContainerStarted","Data":"bdd1b7c039139a151dd6eaf03bd76c6b05f0f3c2383680545dcb68b579fa34ac"} Apr 17 11:17:09.716014 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:09.715985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" event={"ID":"221aef81-7106-48e5-8081-4ba125428409","Type":"ContainerStarted","Data":"7d56ca0b56cc08b5bcbd336fe8fe6a31208796976cc8eeb759b3ae07ac1aa8a7"} Apr 17 11:17:10.723299 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:10.723262 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6311d15-5f59-4e6c-8732-269f06b40c16" containerID="a63e84c3d7db653c139f8c6cb90f97fdc6a89efcd93e9886614e47d1338b681f" exitCode=0 Apr 17 11:17:10.723754 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:10.723325 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" event={"ID":"e6311d15-5f59-4e6c-8732-269f06b40c16","Type":"ContainerDied","Data":"a63e84c3d7db653c139f8c6cb90f97fdc6a89efcd93e9886614e47d1338b681f"} Apr 17 11:17:11.730575 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:11.730533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" event={"ID":"e6311d15-5f59-4e6c-8732-269f06b40c16","Type":"ContainerStarted","Data":"cfc0aaf1018eb904ed207cafdec996b1b7cf76696ebbac7b777f45ec82f1475c"} Apr 17 11:17:11.756732 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:11.755947 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w2d4x" podStartSLOduration=6.416586208 podStartE2EDuration="38.755930161s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:16:36.14895444 +0000 UTC m=+3.197124320" lastFinishedPulling="2026-04-17 11:17:08.488298386 +0000 UTC m=+35.536468273" observedRunningTime="2026-04-17 11:17:11.75524475 +0000 UTC m=+38.803414639" watchObservedRunningTime="2026-04-17 11:17:11.755930161 +0000 UTC m=+38.804100050" Apr 17 11:17:13.626086 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:13.626048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:13.626537 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:13.626243 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:13.626537 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:13.626335 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert podName:dd572e4e-2d5a-47ec-8142-2e264dae6c8b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:21.626315497 +0000 UTC m=+48.674485380 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert") pod "ingress-canary-nd4tb" (UID: "dd572e4e-2d5a-47ec-8142-2e264dae6c8b") : secret "canary-serving-cert" not found Apr 17 11:17:13.726595 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:13.726560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:13.726793 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:13.726701 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:13.726793 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:13.726773 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls podName:997fc9f1-c739-43ee-8f0f-bcc328a8b37f nodeName:}" failed. No retries permitted until 2026-04-17 11:17:21.726750943 +0000 UTC m=+48.774920828 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls") pod "dns-default-2n6g2" (UID: "997fc9f1-c739-43ee-8f0f-bcc328a8b37f") : secret "dns-default-metrics-tls" not found Apr 17 11:17:14.738025 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:14.737987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" event={"ID":"5b6788a2-861c-4390-b69b-1c2415577459","Type":"ContainerStarted","Data":"91922abb7c5647959e44e84a99f979b1cfcd2f167c786689a3cb48d45d9fdbaf"} Apr 17 11:17:14.739309 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:14.739285 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" event={"ID":"221aef81-7106-48e5-8081-4ba125428409","Type":"ContainerStarted","Data":"6b2a277cee0eed44ee02ff8ef67ce385040b80b64c724450f406ef634e74b717"} Apr 17 11:17:14.739539 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:14.739518 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:14.741132 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:14.741100 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" Apr 17 11:17:14.755715 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:14.755668 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" podStartSLOduration=2.429695116 podStartE2EDuration="6.755652844s" podCreationTimestamp="2026-04-17 11:17:08 +0000 UTC" firstStartedPulling="2026-04-17 11:17:09.535455059 +0000 UTC m=+36.583624928" lastFinishedPulling="2026-04-17 11:17:13.861412791 +0000 UTC 
m=+40.909582656" observedRunningTime="2026-04-17 11:17:14.754653713 +0000 UTC m=+41.802823600" watchObservedRunningTime="2026-04-17 11:17:14.755652844 +0000 UTC m=+41.803822735" Apr 17 11:17:17.747221 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:17.747182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" event={"ID":"5b6788a2-861c-4390-b69b-1c2415577459","Type":"ContainerStarted","Data":"ad0e99bbacc04fba498b271f8331fd5f4a1b1f36119458a66e148ae3698d8e75"} Apr 17 11:17:17.747221 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:17.747223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" event={"ID":"5b6788a2-861c-4390-b69b-1c2415577459","Type":"ContainerStarted","Data":"accda3ceb0ef42c0cc46734e5b0a37443564e2cbad876b71b2db87b6fc52ea73"} Apr 17 11:17:17.766690 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:17.766647 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" podStartSLOduration=1.173447363 podStartE2EDuration="8.766635117s" podCreationTimestamp="2026-04-17 11:17:09 +0000 UTC" firstStartedPulling="2026-04-17 11:17:09.536658446 +0000 UTC m=+36.584828327" lastFinishedPulling="2026-04-17 11:17:17.1298462 +0000 UTC m=+44.178016081" observedRunningTime="2026-04-17 11:17:17.765862612 +0000 UTC m=+44.814032502" watchObservedRunningTime="2026-04-17 11:17:17.766635117 +0000 UTC m=+44.814804998" Apr 17 11:17:21.684285 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:21.684246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 
11:17:21.684734 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:21.684400 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:21.684734 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:21.684466 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert podName:dd572e4e-2d5a-47ec-8142-2e264dae6c8b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:37.684450424 +0000 UTC m=+64.732620290 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert") pod "ingress-canary-nd4tb" (UID: "dd572e4e-2d5a-47ec-8142-2e264dae6c8b") : secret "canary-serving-cert" not found Apr 17 11:17:21.785301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:21.785272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:21.785444 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:21.785425 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:21.785496 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:21.785486 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls podName:997fc9f1-c739-43ee-8f0f-bcc328a8b37f nodeName:}" failed. No retries permitted until 2026-04-17 11:17:37.785471275 +0000 UTC m=+64.833641141 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls") pod "dns-default-2n6g2" (UID: "997fc9f1-c739-43ee-8f0f-bcc328a8b37f") : secret "dns-default-metrics-tls" not found Apr 17 11:17:37.693395 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:37.693343 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:17:37.693993 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:37.693506 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:37.693993 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:37.693589 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert podName:dd572e4e-2d5a-47ec-8142-2e264dae6c8b nodeName:}" failed. No retries permitted until 2026-04-17 11:18:09.693569351 +0000 UTC m=+96.741739222 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert") pod "ingress-canary-nd4tb" (UID: "dd572e4e-2d5a-47ec-8142-2e264dae6c8b") : secret "canary-serving-cert" not found Apr 17 11:17:37.794481 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:37.794440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:17:37.794602 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:37.794584 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:37.794662 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:37.794653 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls podName:997fc9f1-c739-43ee-8f0f-bcc328a8b37f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:09.794636891 +0000 UTC m=+96.842806758 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls") pod "dns-default-2n6g2" (UID: "997fc9f1-c739-43ee-8f0f-bcc328a8b37f") : secret "dns-default-metrics-tls" not found Apr 17 11:17:39.308184 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.308140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:39.308588 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.308191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:17:39.310669 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.310650 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:17:39.310741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.310723 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:17:39.319366 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:39.319347 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:17:39.319478 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:17:39.319413 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs podName:0caad504-c16e-477e-b9a9-80928417640e 
nodeName:}" failed. No retries permitted until 2026-04-17 11:18:43.319393 +0000 UTC m=+130.367562866 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs") pod "network-metrics-daemon-tl874" (UID: "0caad504-c16e-477e-b9a9-80928417640e") : secret "metrics-daemon-secret" not found Apr 17 11:17:39.321092 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.321077 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:17:39.333322 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.333297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z28j\" (UniqueName: \"kubernetes.io/projected/08f32961-6393-4bcc-a8bf-c27e9df01e0e-kube-api-access-2z28j\") pod \"network-check-target-9975j\" (UID: \"08f32961-6393-4bcc-a8bf-c27e9df01e0e\") " pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:39.350372 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.350354 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gg44j\"" Apr 17 11:17:39.359299 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.359285 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:39.467694 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.467665 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9975j"] Apr 17 11:17:39.471086 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:17:39.471058 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08f32961_6393_4bcc_a8bf_c27e9df01e0e.slice/crio-594abb9501fe25dab4fc1f9c6993097bcc76e4c2e2f92b2ef64dfde42b91b1b0 WatchSource:0}: Error finding container 594abb9501fe25dab4fc1f9c6993097bcc76e4c2e2f92b2ef64dfde42b91b1b0: Status 404 returned error can't find the container with id 594abb9501fe25dab4fc1f9c6993097bcc76e4c2e2f92b2ef64dfde42b91b1b0 Apr 17 11:17:39.804776 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:39.804745 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9975j" event={"ID":"08f32961-6393-4bcc-a8bf-c27e9df01e0e","Type":"ContainerStarted","Data":"594abb9501fe25dab4fc1f9c6993097bcc76e4c2e2f92b2ef64dfde42b91b1b0"} Apr 17 11:17:42.812374 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:42.812338 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9975j" event={"ID":"08f32961-6393-4bcc-a8bf-c27e9df01e0e","Type":"ContainerStarted","Data":"e8161744946b405647ea379ab979b8befcfacadeb5497aa72807ad59c63470de"} Apr 17 11:17:42.812819 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:42.812461 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:17:42.828593 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:17:42.828555 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9975j" 
podStartSLOduration=67.189903837 podStartE2EDuration="1m9.828541805s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:17:39.472815185 +0000 UTC m=+66.520985052" lastFinishedPulling="2026-04-17 11:17:42.111453152 +0000 UTC m=+69.159623020" observedRunningTime="2026-04-17 11:17:42.828159047 +0000 UTC m=+69.876328934" watchObservedRunningTime="2026-04-17 11:17:42.828541805 +0000 UTC m=+69.876711693" Apr 17 11:18:09.721277 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:18:09.721245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:18:09.721650 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:18:09.721389 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:18:09.721650 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:18:09.721448 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert podName:dd572e4e-2d5a-47ec-8142-2e264dae6c8b nodeName:}" failed. No retries permitted until 2026-04-17 11:19:13.721433459 +0000 UTC m=+160.769603325 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert") pod "ingress-canary-nd4tb" (UID: "dd572e4e-2d5a-47ec-8142-2e264dae6c8b") : secret "canary-serving-cert" not found Apr 17 11:18:09.822284 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:18:09.822240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2" Apr 17 11:18:09.822404 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:18:09.822382 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:18:09.822459 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:18:09.822449 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls podName:997fc9f1-c739-43ee-8f0f-bcc328a8b37f nodeName:}" failed. No retries permitted until 2026-04-17 11:19:13.822432875 +0000 UTC m=+160.870602742 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls") pod "dns-default-2n6g2" (UID: "997fc9f1-c739-43ee-8f0f-bcc328a8b37f") : secret "dns-default-metrics-tls" not found Apr 17 11:18:13.817706 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:18:13.817675 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9975j" Apr 17 11:18:43.354961 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:18:43.354924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:18:43.355441 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:18:43.355075 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:18:43.355441 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:18:43.355152 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs podName:0caad504-c16e-477e-b9a9-80928417640e nodeName:}" failed. No retries permitted until 2026-04-17 11:20:45.355136623 +0000 UTC m=+252.403306489 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs") pod "network-metrics-daemon-tl874" (UID: "0caad504-c16e-477e-b9a9-80928417640e") : secret "metrics-daemon-secret" not found Apr 17 11:18:51.570902 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:18:51.570874 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hq6cq_cd45e2c2-be74-4898-855b-e51a00ea7a92/dns-node-resolver/0.log" Apr 17 11:18:52.171001 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:18:52.170974 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mbc4v_1ad8d75d-d9b5-45d3-ada0-68b4d648c30f/node-ca/0.log" Apr 17 11:19:08.872428 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:19:08.872367 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-nd4tb" podUID="dd572e4e-2d5a-47ec-8142-2e264dae6c8b" Apr 17 11:19:08.889651 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:19:08.889622 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2n6g2" podUID="997fc9f1-c739-43ee-8f0f-bcc328a8b37f" Apr 17 11:19:09.011672 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:09.011641 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nd4tb" Apr 17 11:19:09.011828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:09.011645 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2n6g2"
Apr 17 11:19:10.555634 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:19:10.555571 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tl874" podUID="0caad504-c16e-477e-b9a9-80928417640e"
Apr 17 11:19:10.618460 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.618425 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9mvrz"]
Apr 17 11:19:10.621425 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.621403 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.623400 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.623378 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 11:19:10.623507 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.623379 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 11:19:10.624146 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.624127 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 11:19:10.624146 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.624138 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 11:19:10.624332 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.624237 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5zvgq\""
Apr 17 11:19:10.632584 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.632562 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9mvrz"]
Apr 17 11:19:10.644690 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.644671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-crio-socket\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.644771 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.644700 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.644771 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.644719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n926r\" (UniqueName: \"kubernetes.io/projected/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-kube-api-access-n926r\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.644845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.644804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-data-volume\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.644878 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.644842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.651761 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.651741 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-869d7b8bf7-cxst8"]
Apr 17 11:19:10.654334 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.654319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.656486 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.656471 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qf8qh\""
Apr 17 11:19:10.656570 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.656471 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 11:19:10.656662 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.656649 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 11:19:10.656723 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.656694 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 11:19:10.663081 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.663063 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 11:19:10.668362 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.668344 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-869d7b8bf7-cxst8"]
Apr 17 11:19:10.745306 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.745470 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58952672-09f4-40b6-ac40-4902268e9ea4-installation-pull-secrets\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.745470 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-crio-socket\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.745470 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745377 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/58952672-09f4-40b6-ac40-4902268e9ea4-image-registry-private-configuration\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.745470 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt52c\" (UniqueName: \"kubernetes.io/projected/58952672-09f4-40b6-ac40-4902268e9ea4-kube-api-access-zt52c\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.745470 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.745736 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745469 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-crio-socket\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.745736 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n926r\" (UniqueName: \"kubernetes.io/projected/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-kube-api-access-n926r\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.745736 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58952672-09f4-40b6-ac40-4902268e9ea4-registry-tls\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.745736 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58952672-09f4-40b6-ac40-4902268e9ea4-ca-trust-extracted\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.745736 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58952672-09f4-40b6-ac40-4902268e9ea4-registry-certificates\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.745736 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-data-volume\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.745736 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58952672-09f4-40b6-ac40-4902268e9ea4-bound-sa-token\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.745736 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745721 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58952672-09f4-40b6-ac40-4902268e9ea4-trusted-ca\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.746072 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-data-volume\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.746072 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.745923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.747637 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.747620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.755253 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.755228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n926r\" (UniqueName: \"kubernetes.io/projected/a4d42c8e-cb41-4cbc-847e-55eedee9b9c1-kube-api-access-n926r\") pod \"insights-runtime-extractor-9mvrz\" (UID: \"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1\") " pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.846794 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.846764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58952672-09f4-40b6-ac40-4902268e9ea4-registry-tls\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.846939 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.846812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58952672-09f4-40b6-ac40-4902268e9ea4-ca-trust-extracted\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.846939 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.846834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58952672-09f4-40b6-ac40-4902268e9ea4-registry-certificates\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.847073 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.847043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58952672-09f4-40b6-ac40-4902268e9ea4-bound-sa-token\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.847166 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.847098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58952672-09f4-40b6-ac40-4902268e9ea4-trusted-ca\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.847225 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.847173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58952672-09f4-40b6-ac40-4902268e9ea4-installation-pull-secrets\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.847225 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.847184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58952672-09f4-40b6-ac40-4902268e9ea4-ca-trust-extracted\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.847321 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.847251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/58952672-09f4-40b6-ac40-4902268e9ea4-image-registry-private-configuration\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.847321 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.847280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zt52c\" (UniqueName: \"kubernetes.io/projected/58952672-09f4-40b6-ac40-4902268e9ea4-kube-api-access-zt52c\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.847734 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.847714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58952672-09f4-40b6-ac40-4902268e9ea4-registry-certificates\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.847933 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.847916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58952672-09f4-40b6-ac40-4902268e9ea4-trusted-ca\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.849293 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.849264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58952672-09f4-40b6-ac40-4902268e9ea4-registry-tls\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.849397 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.849364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58952672-09f4-40b6-ac40-4902268e9ea4-installation-pull-secrets\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.849859 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.849842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/58952672-09f4-40b6-ac40-4902268e9ea4-image-registry-private-configuration\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.856256 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.856238 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58952672-09f4-40b6-ac40-4902268e9ea4-bound-sa-token\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.857199 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.857178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt52c\" (UniqueName: \"kubernetes.io/projected/58952672-09f4-40b6-ac40-4902268e9ea4-kube-api-access-zt52c\") pod \"image-registry-869d7b8bf7-cxst8\" (UID: \"58952672-09f4-40b6-ac40-4902268e9ea4\") " pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:10.929974 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.929940 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9mvrz"
Apr 17 11:19:10.963046 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:10.963016 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:11.054602 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:11.054573 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9mvrz"]
Apr 17 11:19:11.059071 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:19:11.059046 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d42c8e_cb41_4cbc_847e_55eedee9b9c1.slice/crio-6b02f4f70b0523f2edca71c162ea853992c6c60da273bbf9f22ca986fe4645d5 WatchSource:0}: Error finding container 6b02f4f70b0523f2edca71c162ea853992c6c60da273bbf9f22ca986fe4645d5: Status 404 returned error can't find the container with id 6b02f4f70b0523f2edca71c162ea853992c6c60da273bbf9f22ca986fe4645d5
Apr 17 11:19:11.097645 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:11.097584 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-869d7b8bf7-cxst8"]
Apr 17 11:19:11.101958 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:19:11.101930 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58952672_09f4_40b6_ac40_4902268e9ea4.slice/crio-a608119d6d948b42280080cdbf55f40ac5f8118755a6c4d414ffeb0ddc4fcca8 WatchSource:0}: Error finding container a608119d6d948b42280080cdbf55f40ac5f8118755a6c4d414ffeb0ddc4fcca8: Status 404 returned error can't find the container with id a608119d6d948b42280080cdbf55f40ac5f8118755a6c4d414ffeb0ddc4fcca8
Apr 17 11:19:12.020981 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:12.020946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9mvrz" event={"ID":"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1","Type":"ContainerStarted","Data":"b09070538d9ec3fdaccec26dc054e129f59cdf31ac189517f5372d3612385cce"}
Apr 17 11:19:12.020981 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:12.020982 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9mvrz" event={"ID":"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1","Type":"ContainerStarted","Data":"c30c7648e7f19b66ac7c7a50887a5c67052c4fc9bc505cf6a8c45349045441a9"}
Apr 17 11:19:12.021459 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:12.020992 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9mvrz" event={"ID":"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1","Type":"ContainerStarted","Data":"6b02f4f70b0523f2edca71c162ea853992c6c60da273bbf9f22ca986fe4645d5"}
Apr 17 11:19:12.022204 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:12.022181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8" event={"ID":"58952672-09f4-40b6-ac40-4902268e9ea4","Type":"ContainerStarted","Data":"1cfdb1e5f33453d597e9dc02c00a993475aa85116b9f915974ac44baaed2ce05"}
Apr 17 11:19:12.022292 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:12.022210 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8" event={"ID":"58952672-09f4-40b6-ac40-4902268e9ea4","Type":"ContainerStarted","Data":"a608119d6d948b42280080cdbf55f40ac5f8118755a6c4d414ffeb0ddc4fcca8"}
Apr 17 11:19:12.022341 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:12.022318 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8"
Apr 17 11:19:12.041563 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:12.041518 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8" podStartSLOduration=2.041504817 podStartE2EDuration="2.041504817s" podCreationTimestamp="2026-04-17 11:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:19:12.040614432 +0000 UTC m=+159.088784332" watchObservedRunningTime="2026-04-17 11:19:12.041504817 +0000 UTC m=+159.089674704"
Apr 17 11:19:13.771924 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:13.771884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb"
Apr 17 11:19:13.774186 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:13.774164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd572e4e-2d5a-47ec-8142-2e264dae6c8b-cert\") pod \"ingress-canary-nd4tb\" (UID: \"dd572e4e-2d5a-47ec-8142-2e264dae6c8b\") " pod="openshift-ingress-canary/ingress-canary-nd4tb"
Apr 17 11:19:13.816021 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:13.815994 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5xzqf\""
Apr 17 11:19:13.822797 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:13.822782 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nd4tb"
Apr 17 11:19:13.873822 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:13.873790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2"
Apr 17 11:19:13.876478 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:13.876456 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/997fc9f1-c739-43ee-8f0f-bcc328a8b37f-metrics-tls\") pod \"dns-default-2n6g2\" (UID: \"997fc9f1-c739-43ee-8f0f-bcc328a8b37f\") " pod="openshift-dns/dns-default-2n6g2"
Apr 17 11:19:13.936542 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:13.936511 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nd4tb"]
Apr 17 11:19:13.939413 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:19:13.939386 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd572e4e_2d5a_47ec_8142_2e264dae6c8b.slice/crio-316e757ec2ea5068de160503385898870c70c22072b0677ba989da92c0c67992 WatchSource:0}: Error finding container 316e757ec2ea5068de160503385898870c70c22072b0677ba989da92c0c67992: Status 404 returned error can't find the container with id 316e757ec2ea5068de160503385898870c70c22072b0677ba989da92c0c67992
Apr 17 11:19:14.028512 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.028432 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9mvrz" event={"ID":"a4d42c8e-cb41-4cbc-847e-55eedee9b9c1","Type":"ContainerStarted","Data":"6bfef462aad99446c92db9ff9b3ab7df6e2e4c45a8e52280cf4ffbb2bef4b944"}
Apr 17 11:19:14.029672 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.029648 2577 generic.go:358] "Generic (PLEG): container finished" podID="221aef81-7106-48e5-8081-4ba125428409" containerID="6b2a277cee0eed44ee02ff8ef67ce385040b80b64c724450f406ef634e74b717" exitCode=1
Apr 17 11:19:14.029782 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.029710 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" event={"ID":"221aef81-7106-48e5-8081-4ba125428409","Type":"ContainerDied","Data":"6b2a277cee0eed44ee02ff8ef67ce385040b80b64c724450f406ef634e74b717"}
Apr 17 11:19:14.030018 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.029997 2577 scope.go:117] "RemoveContainer" containerID="6b2a277cee0eed44ee02ff8ef67ce385040b80b64c724450f406ef634e74b717"
Apr 17 11:19:14.030712 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.030692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nd4tb" event={"ID":"dd572e4e-2d5a-47ec-8142-2e264dae6c8b","Type":"ContainerStarted","Data":"316e757ec2ea5068de160503385898870c70c22072b0677ba989da92c0c67992"}
Apr 17 11:19:14.045766 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.045730 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9mvrz" podStartSLOduration=2.022429242 podStartE2EDuration="4.045715794s" podCreationTimestamp="2026-04-17 11:19:10 +0000 UTC" firstStartedPulling="2026-04-17 11:19:11.120014099 +0000 UTC m=+158.168183969" lastFinishedPulling="2026-04-17 11:19:13.143300652 +0000 UTC m=+160.191470521" observedRunningTime="2026-04-17 11:19:14.045455832 +0000 UTC m=+161.093625720" watchObservedRunningTime="2026-04-17 11:19:14.045715794 +0000 UTC m=+161.093885681"
Apr 17 11:19:14.114189 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.114162 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v5zg9\""
Apr 17 11:19:14.123084 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.123061 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2n6g2"
Apr 17 11:19:14.237345 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.237304 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2n6g2"]
Apr 17 11:19:14.242968 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:19:14.242938 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod997fc9f1_c739_43ee_8f0f_bcc328a8b37f.slice/crio-2214217dc1fce1b70b1619d30f6e9cb9bf2cafb9dd384e66f905df1b65c39502 WatchSource:0}: Error finding container 2214217dc1fce1b70b1619d30f6e9cb9bf2cafb9dd384e66f905df1b65c39502: Status 404 returned error can't find the container with id 2214217dc1fce1b70b1619d30f6e9cb9bf2cafb9dd384e66f905df1b65c39502
Apr 17 11:19:14.739768 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:14.739734 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f"
Apr 17 11:19:15.035016 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:15.034918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2n6g2" event={"ID":"997fc9f1-c739-43ee-8f0f-bcc328a8b37f","Type":"ContainerStarted","Data":"2214217dc1fce1b70b1619d30f6e9cb9bf2cafb9dd384e66f905df1b65c39502"}
Apr 17 11:19:15.036790 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:15.036761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f" event={"ID":"221aef81-7106-48e5-8081-4ba125428409","Type":"ContainerStarted","Data":"a9dc2c101c8b5dda6c57886805a70b50e3e44a7d8426bdd01fe26ee861e26886"}
Apr 17 11:19:16.041661 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:16.041602 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nd4tb" event={"ID":"dd572e4e-2d5a-47ec-8142-2e264dae6c8b","Type":"ContainerStarted","Data":"7f592a63e9c7be2ccfe60f0016ef710414e842b1ba277f5276842408b7f70f9d"}
Apr 17 11:19:16.042054 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:16.041926 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f"
Apr 17 11:19:16.042631 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:16.042611 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-674fbc444c-djk9f"
Apr 17 11:19:16.057577 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:16.057535 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nd4tb" podStartSLOduration=129.051900125 podStartE2EDuration="2m11.057521928s" podCreationTimestamp="2026-04-17 11:17:05 +0000 UTC" firstStartedPulling="2026-04-17 11:19:13.941202214 +0000 UTC m=+160.989372080" lastFinishedPulling="2026-04-17 11:19:15.946824018 +0000 UTC m=+162.994993883" observedRunningTime="2026-04-17 11:19:16.056517002 +0000 UTC m=+163.104686891" watchObservedRunningTime="2026-04-17 11:19:16.057521928 +0000 UTC m=+163.105691794"
Apr 17 11:19:17.045344 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:17.045301 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2n6g2" event={"ID":"997fc9f1-c739-43ee-8f0f-bcc328a8b37f","Type":"ContainerStarted","Data":"fafa7e2a88490c26698b4248e9b02a425586147fea5fa41cb9116ee9d0fba767"}
Apr 17 11:19:17.045344 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:17.045344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2n6g2" event={"ID":"997fc9f1-c739-43ee-8f0f-bcc328a8b37f","Type":"ContainerStarted","Data":"d74a74c95f4ce0677509fc3fc1e63e14ab99164f81e74bb525568a53a2619c10"}
Apr 17 11:19:17.045858 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:17.045581 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2n6g2"
Apr 17 11:19:17.067167 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:17.067105 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2n6g2" podStartSLOduration=130.363195496 podStartE2EDuration="2m12.067094272s" podCreationTimestamp="2026-04-17 11:17:05 +0000 UTC" firstStartedPulling="2026-04-17 11:19:14.244839595 +0000 UTC m=+161.293009464" lastFinishedPulling="2026-04-17 11:19:15.948738374 +0000 UTC m=+162.996908240" observedRunningTime="2026-04-17 11:19:17.065539383 +0000 UTC m=+164.113709273" watchObservedRunningTime="2026-04-17 11:19:17.067094272 +0000 UTC m=+164.115264185"
Apr 17 11:19:24.888924 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.888847 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cgcx6"]
Apr 17 11:19:24.891913 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.891892 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:24.897607 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.897580 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 11:19:24.897756 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.897581 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 11:19:24.897756 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.897586 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 11:19:24.898219 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.898200 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 11:19:24.898298 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.898222 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2grgd\""
Apr 17 11:19:24.898356 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.898331 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 11:19:24.898489 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.898472 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 11:19:24.951795 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.951770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-accelerators-collector-config\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:24.951898 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.951805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab3076be-7073-467c-8065-01e1192ff5f6-sys\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:24.951898 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.951839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ab3076be-7073-467c-8065-01e1192ff5f6-root\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:24.951964 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.951939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-tls\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:24.951997 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.951966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab3076be-7073-467c-8065-01e1192ff5f6-metrics-client-ca\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:24.951997 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.951988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-textfile\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:24.952063 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.952010 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:24.952063 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.952030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpgqp\" (UniqueName: \"kubernetes.io/projected/ab3076be-7073-467c-8065-01e1192ff5f6-kube-api-access-wpgqp\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:24.952063 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:24.952055 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-wtmp\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr 17 11:19:25.053172 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-accelerators-collector-config\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6"
Apr
17 11:19:25.053303 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab3076be-7073-467c-8065-01e1192ff5f6-sys\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053303 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053205 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ab3076be-7073-467c-8065-01e1192ff5f6-root\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053303 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-tls\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053303 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab3076be-7073-467c-8065-01e1192ff5f6-metrics-client-ca\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053303 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-textfile\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 
11:19:25.053545 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab3076be-7073-467c-8065-01e1192ff5f6-sys\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053545 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053317 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ab3076be-7073-467c-8065-01e1192ff5f6-root\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053545 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053545 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:19:25.053366 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:19:25.053545 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpgqp\" (UniqueName: \"kubernetes.io/projected/ab3076be-7073-467c-8065-01e1192ff5f6-kube-api-access-wpgqp\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053545 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:19:25.053425 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-tls podName:ab3076be-7073-467c-8065-01e1192ff5f6 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:25.553406256 +0000 UTC m=+172.601576126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-tls") pod "node-exporter-cgcx6" (UID: "ab3076be-7073-467c-8065-01e1192ff5f6") : secret "node-exporter-tls" not found Apr 17 11:19:25.053545 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-wtmp\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053840 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-wtmp\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053840 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-textfile\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053840 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053776 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-accelerators-collector-config\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.053941 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.053904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab3076be-7073-467c-8065-01e1192ff5f6-metrics-client-ca\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.055609 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.055590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.061615 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.061594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpgqp\" (UniqueName: \"kubernetes.io/projected/ab3076be-7073-467c-8065-01e1192ff5f6-kube-api-access-wpgqp\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.535232 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.535194 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:19:25.555476 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.555444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-tls\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.557667 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.557646 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab3076be-7073-467c-8065-01e1192ff5f6-node-exporter-tls\") pod \"node-exporter-cgcx6\" (UID: \"ab3076be-7073-467c-8065-01e1192ff5f6\") " pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.801045 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:25.800958 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-cgcx6" Apr 17 11:19:25.809007 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:19:25.808981 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab3076be_7073_467c_8065_01e1192ff5f6.slice/crio-e998dd784a7cf4909a247b60496e1da4fa235796154291587cd74401b41aad08 WatchSource:0}: Error finding container e998dd784a7cf4909a247b60496e1da4fa235796154291587cd74401b41aad08: Status 404 returned error can't find the container with id e998dd784a7cf4909a247b60496e1da4fa235796154291587cd74401b41aad08 Apr 17 11:19:26.067272 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:26.067188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cgcx6" event={"ID":"ab3076be-7073-467c-8065-01e1192ff5f6","Type":"ContainerStarted","Data":"e998dd784a7cf4909a247b60496e1da4fa235796154291587cd74401b41aad08"} Apr 17 11:19:27.049394 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:27.049368 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2n6g2" Apr 17 11:19:27.070711 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:27.070680 2577 generic.go:358] "Generic (PLEG): container finished" podID="ab3076be-7073-467c-8065-01e1192ff5f6" containerID="04bcbe3b7e88b253ea79c35185fc6d7be30cfb937834a5ebd56039717c17db67" exitCode=0 Apr 17 11:19:27.071049 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:27.070744 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cgcx6" event={"ID":"ab3076be-7073-467c-8065-01e1192ff5f6","Type":"ContainerDied","Data":"04bcbe3b7e88b253ea79c35185fc6d7be30cfb937834a5ebd56039717c17db67"} Apr 17 11:19:28.074493 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:28.074459 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cgcx6" 
event={"ID":"ab3076be-7073-467c-8065-01e1192ff5f6","Type":"ContainerStarted","Data":"d9ac58c7bdf09d47e8b1d735d03b6c76a1759b038ea2f6b99bcbbd115cee3985"} Apr 17 11:19:28.074493 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:28.074496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cgcx6" event={"ID":"ab3076be-7073-467c-8065-01e1192ff5f6","Type":"ContainerStarted","Data":"fd8fb2466761b5f5d91372738a05e1acb3e8a936f76edf7d6b73d63943b48885"} Apr 17 11:19:28.093779 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:28.093732 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cgcx6" podStartSLOduration=3.342118215 podStartE2EDuration="4.093719594s" podCreationTimestamp="2026-04-17 11:19:24 +0000 UTC" firstStartedPulling="2026-04-17 11:19:25.810865857 +0000 UTC m=+172.859035723" lastFinishedPulling="2026-04-17 11:19:26.56246722 +0000 UTC m=+173.610637102" observedRunningTime="2026-04-17 11:19:28.09281818 +0000 UTC m=+175.140988067" watchObservedRunningTime="2026-04-17 11:19:28.093719594 +0000 UTC m=+175.141889482" Apr 17 11:19:30.967320 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:30.967287 2577 patch_prober.go:28] interesting pod/image-registry-869d7b8bf7-cxst8 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 11:19:30.967675 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:30.967341 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8" podUID="58952672-09f4-40b6-ac40-4902268e9ea4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 11:19:31.186731 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.186705 2577 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:19:31.190339 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.190322 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.192566 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.192546 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 11:19:31.193131 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193097 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 11:19:31.193277 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193196 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 11:19:31.193277 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193242 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 11:19:31.193440 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193244 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 11:19:31.193440 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193349 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 11:19:31.193523 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193445 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 11:19:31.193720 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193703 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-49ltg\"" Apr 17 11:19:31.193787 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193771 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-f2pjp4j84f30v\"" Apr 17 11:19:31.193874 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193850 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 11:19:31.193974 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.193935 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 11:19:31.194218 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.194200 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 11:19:31.194327 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.194285 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 11:19:31.199378 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.199359 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 11:19:31.201656 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.201641 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 11:19:31.210078 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.210058 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:19:31.300827 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.300762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.300827 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.300806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-web-config\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.300970 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.300827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.300970 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.300858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.300970 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.300888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.300970 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:19:31.300905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.300970 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.300954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301144 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.300978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301144 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gknkc\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-kube-api-access-gknkc\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301144 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config-out\") pod 
\"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301144 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301144 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301055 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301144 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301144 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301104 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301354 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301154 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301354 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301175 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301354 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301197 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.301354 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.301221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402317 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402429 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402429 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-web-config\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402429 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402555 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402555 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402487 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-serving-certs-ca-bundle\") 
pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402555 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402669 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402669 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402669 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gknkc\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-kube-api-access-gknkc\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402669 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config-out\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402669 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402901 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402901 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402901 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402901 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402784 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402901 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.402901 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.402848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.403345 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.403321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.403495 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.403330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.403609 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:19:31.403514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.404363 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.404343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.404787 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.404768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.405657 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.405541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-web-config\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.405657 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.405642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.405883 
ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.405864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.406091 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.406050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.406440 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.406411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.406657 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.406638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.406776 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.406755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.407513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.407488 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.408133 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.408081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.408280 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.408263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.408321 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.408273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config-out\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.408406 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.408390 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.413416 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.413397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gknkc\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-kube-api-access-gknkc\") pod \"prometheus-k8s-0\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.500394 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.500366 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.628974 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:31.628945 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:19:31.632296 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:19:31.632270 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1bf0c1_1bd8_42e2_b11e_c43388d9ae05.slice/crio-bbdaf4941c7c3279ef86013e361e98bc2d1ffce327bb0651b75d028e307ddb80 WatchSource:0}: Error finding container bbdaf4941c7c3279ef86013e361e98bc2d1ffce327bb0651b75d028e307ddb80: Status 404 returned error can't find the container with id bbdaf4941c7c3279ef86013e361e98bc2d1ffce327bb0651b75d028e307ddb80 Apr 17 11:19:32.085921 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:32.085887 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerStarted","Data":"bbdaf4941c7c3279ef86013e361e98bc2d1ffce327bb0651b75d028e307ddb80"} Apr 17 11:19:33.029522 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:33.029496 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-869d7b8bf7-cxst8" Apr 17 
11:19:33.089944 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:33.089912 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerID="27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94" exitCode=0 Apr 17 11:19:33.090331 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:33.089983 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerDied","Data":"27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94"} Apr 17 11:19:36.100479 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:36.100445 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerStarted","Data":"f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3"} Apr 17 11:19:36.100833 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:36.100484 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerStarted","Data":"875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e"} Apr 17 11:19:38.108967 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:38.108924 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerStarted","Data":"d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa"} Apr 17 11:19:38.108967 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:38.108966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerStarted","Data":"27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d"} Apr 17 11:19:38.109379 ip-10-0-142-247 kubenswrapper[2577]: I0417 
11:19:38.108980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerStarted","Data":"3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247"} Apr 17 11:19:38.109379 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:38.109013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerStarted","Data":"4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602"} Apr 17 11:19:38.137168 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:38.137126 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.296519801 podStartE2EDuration="7.137096026s" podCreationTimestamp="2026-04-17 11:19:31 +0000 UTC" firstStartedPulling="2026-04-17 11:19:31.63479946 +0000 UTC m=+178.682969325" lastFinishedPulling="2026-04-17 11:19:37.47537568 +0000 UTC m=+184.523545550" observedRunningTime="2026-04-17 11:19:38.136249598 +0000 UTC m=+185.184419504" watchObservedRunningTime="2026-04-17 11:19:38.137096026 +0000 UTC m=+185.185265914" Apr 17 11:19:41.501355 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:19:41.501312 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:06.379593 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:06.379554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cgcx6_ab3076be-7073-467c-8065-01e1192ff5f6/init-textfile/0.log" Apr 17 11:20:06.584299 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:06.584266 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cgcx6_ab3076be-7073-467c-8065-01e1192ff5f6/node-exporter/0.log" Apr 17 11:20:06.784011 ip-10-0-142-247 kubenswrapper[2577]: I0417 
11:20:06.783942 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cgcx6_ab3076be-7073-467c-8065-01e1192ff5f6/kube-rbac-proxy/0.log" Apr 17 11:20:08.780004 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:08.779976 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05/init-config-reloader/0.log" Apr 17 11:20:08.985538 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:08.985510 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05/prometheus/0.log" Apr 17 11:20:09.181181 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:09.181152 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05/config-reloader/0.log" Apr 17 11:20:09.380343 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:09.380317 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05/thanos-sidecar/0.log" Apr 17 11:20:09.581709 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:09.581636 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05/kube-rbac-proxy-web/0.log" Apr 17 11:20:09.783711 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:09.783687 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05/kube-rbac-proxy/0.log" Apr 17 11:20:09.980150 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:09.980106 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05/kube-rbac-proxy-thanos/0.log" Apr 17 11:20:19.386513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:19.386473 2577 
prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" podUID="5b6788a2-861c-4390-b69b-1c2415577459" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:20:29.390216 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:29.386776 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" podUID="5b6788a2-861c-4390-b69b-1c2415577459" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:20:31.501357 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:31.501316 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:31.519703 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:31.519678 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:32.261323 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:32.261300 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:39.387187 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:39.387144 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" podUID="5b6788a2-861c-4390-b69b-1c2415577459" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:20:39.387788 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:39.387225 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" Apr 17 11:20:39.387894 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:39.387855 2577 kuberuntime_manager.go:1107] "Message for 
Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ad0e99bbacc04fba498b271f8331fd5f4a1b1f36119458a66e148ae3698d8e75"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 11:20:39.387955 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:39.387938 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" podUID="5b6788a2-861c-4390-b69b-1c2415577459" containerName="service-proxy" containerID="cri-o://ad0e99bbacc04fba498b271f8331fd5f4a1b1f36119458a66e148ae3698d8e75" gracePeriod=30 Apr 17 11:20:40.266513 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:40.266483 2577 generic.go:358] "Generic (PLEG): container finished" podID="5b6788a2-861c-4390-b69b-1c2415577459" containerID="ad0e99bbacc04fba498b271f8331fd5f4a1b1f36119458a66e148ae3698d8e75" exitCode=2 Apr 17 11:20:40.266687 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:40.266546 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" event={"ID":"5b6788a2-861c-4390-b69b-1c2415577459","Type":"ContainerDied","Data":"ad0e99bbacc04fba498b271f8331fd5f4a1b1f36119458a66e148ae3698d8e75"} Apr 17 11:20:40.266687 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:40.266572 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54495bbd66-pxk6s" event={"ID":"5b6788a2-861c-4390-b69b-1c2415577459","Type":"ContainerStarted","Data":"b85f49a86d40e55ae06809e2fda64a3772a38b9d581f916d276ebdd7fcf5abb6"} Apr 17 11:20:45.368733 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:45.368696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:20:45.370901 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:45.370874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0caad504-c16e-477e-b9a9-80928417640e-metrics-certs\") pod \"network-metrics-daemon-tl874\" (UID: \"0caad504-c16e-477e-b9a9-80928417640e\") " pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:20:45.638220 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:45.638148 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ntppc\"" Apr 17 11:20:45.646095 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:45.646078 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tl874" Apr 17 11:20:45.762783 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:45.762752 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tl874"] Apr 17 11:20:45.767579 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:20:45.767543 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0caad504_c16e_477e_b9a9_80928417640e.slice/crio-761610c8356778d0c117a83a9288066b32531c066493955d0bdf4971f096c0c9 WatchSource:0}: Error finding container 761610c8356778d0c117a83a9288066b32531c066493955d0bdf4971f096c0c9: Status 404 returned error can't find the container with id 761610c8356778d0c117a83a9288066b32531c066493955d0bdf4971f096c0c9 Apr 17 11:20:46.280993 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:46.280952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tl874" 
event={"ID":"0caad504-c16e-477e-b9a9-80928417640e","Type":"ContainerStarted","Data":"761610c8356778d0c117a83a9288066b32531c066493955d0bdf4971f096c0c9"} Apr 17 11:20:48.287350 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:48.287310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tl874" event={"ID":"0caad504-c16e-477e-b9a9-80928417640e","Type":"ContainerStarted","Data":"afbdd97bb35d80559ca6f30151f35aa9c9d961bac3acb5521dd3571c097ef899"} Apr 17 11:20:48.287350 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:48.287351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tl874" event={"ID":"0caad504-c16e-477e-b9a9-80928417640e","Type":"ContainerStarted","Data":"21a5f34b8a71a8f51f6257546c475754bf753af5248c17f810fc308953a2ede7"} Apr 17 11:20:48.301949 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:48.301897 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tl874" podStartSLOduration=253.6594793 podStartE2EDuration="4m15.301881732s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:20:45.769418857 +0000 UTC m=+252.817588723" lastFinishedPulling="2026-04-17 11:20:47.411821289 +0000 UTC m=+254.459991155" observedRunningTime="2026-04-17 11:20:48.301045175 +0000 UTC m=+255.349215063" watchObservedRunningTime="2026-04-17 11:20:48.301881732 +0000 UTC m=+255.350051621" Apr 17 11:20:49.623991 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:49.623889 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:20:49.624754 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:49.624722 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="prometheus" 
containerID="cri-o://875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e" gracePeriod=600 Apr 17 11:20:49.625031 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:49.624996 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy" containerID="cri-o://27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d" gracePeriod=600 Apr 17 11:20:49.625152 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:49.625071 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="thanos-sidecar" containerID="cri-o://4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602" gracePeriod=600 Apr 17 11:20:49.625372 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:49.625231 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy-web" containerID="cri-o://3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247" gracePeriod=600 Apr 17 11:20:49.625372 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:49.625288 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy-thanos" containerID="cri-o://d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa" gracePeriod=600 Apr 17 11:20:49.625511 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:49.625350 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="config-reloader" 
containerID="cri-o://f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3" gracePeriod=600 Apr 17 11:20:49.863462 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:49.863440 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.008412 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008313 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-serving-certs-ca-bundle\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008412 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008362 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config-out\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008412 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008395 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008689 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008421 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gknkc\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-kube-api-access-gknkc\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008689 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008445 2577 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-kubelet-serving-ca-bundle\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008689 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008484 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-tls-assets\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008689 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008522 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-metrics-client-certs\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008689 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008551 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-grpc-tls\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008689 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008579 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-rulefiles-0\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008689 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008621 2577 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-trusted-ca-bundle\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008689 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008652 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.008689 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008678 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-db\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008706 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-tls\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008733 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-metrics-client-ca\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008761 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-kube-rbac-proxy\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008790 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-web-config\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008816 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008832 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008858 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-thanos-prometheus-http-client-file\") pod \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\" (UID: \"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05\") " Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.008865 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.009100 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.009140 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.009136 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.009687 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.009648 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") 
pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:20:50.010080 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.010017 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:20:50.010858 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.010509 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:20:50.011937 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.011718 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:20:50.011937 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.011858 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:50.012226 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.012194 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:50.012527 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.012506 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config-out" (OuterVolumeSpecName: "config-out") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:20:50.012751 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.012711 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-kube-api-access-gknkc" (OuterVolumeSpecName: "kube-api-access-gknkc") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "kube-api-access-gknkc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:20:50.012751 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.012740 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:50.013039 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.013000 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:50.013261 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.013234 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config" (OuterVolumeSpecName: "config") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:50.013339 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.013293 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:50.013479 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.013460 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:20:50.013613 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.013595 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:50.014356 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.014326 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:50.021495 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.021474 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-web-config" (OuterVolumeSpecName: "web-config") pod "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" (UID: "0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:50.109581 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109549 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109581 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109578 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-db\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109589 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-tls\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109599 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-configmap-metrics-client-ca\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109609 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-kube-rbac-proxy\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109617 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-web-config\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:20:50.109627 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109637 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-thanos-prometheus-http-client-file\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109648 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-config-out\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109656 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109665 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gknkc\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-kube-api-access-gknkc\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109673 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-tls-assets\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 
ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109683 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-metrics-client-certs\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109692 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-secret-grpc-tls\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109700 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.109741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.109710 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05-prometheus-trusted-ca-bundle\") on node \"ip-10-0-142-247.ec2.internal\" DevicePath \"\"" Apr 17 11:20:50.294741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294651 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerID="d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa" exitCode=0 Apr 17 11:20:50.294741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294685 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerID="27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d" exitCode=0 Apr 17 11:20:50.294741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294695 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerID="3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247" exitCode=0 Apr 17 11:20:50.294741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294704 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerID="4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602" exitCode=0 Apr 17 11:20:50.294741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294711 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerID="f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3" exitCode=0 Apr 17 11:20:50.294741 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294719 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerID="875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e" exitCode=0 Apr 17 11:20:50.295096 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerDied","Data":"d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa"} Apr 17 11:20:50.295096 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294777 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerDied","Data":"27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d"} Apr 17 11:20:50.295096 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294785 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.295096 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294798 2577 scope.go:117] "RemoveContainer" containerID="d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa" Apr 17 11:20:50.295096 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerDied","Data":"3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247"} Apr 17 11:20:50.295096 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerDied","Data":"4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602"} Apr 17 11:20:50.295096 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294941 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerDied","Data":"f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3"} Apr 17 11:20:50.295096 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294957 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerDied","Data":"875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e"} Apr 17 11:20:50.295096 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.294973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05","Type":"ContainerDied","Data":"bbdaf4941c7c3279ef86013e361e98bc2d1ffce327bb0651b75d028e307ddb80"} Apr 17 11:20:50.302339 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.302319 
2577 scope.go:117] "RemoveContainer" containerID="27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d" Apr 17 11:20:50.308819 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.308802 2577 scope.go:117] "RemoveContainer" containerID="3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247" Apr 17 11:20:50.314681 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.314667 2577 scope.go:117] "RemoveContainer" containerID="4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602" Apr 17 11:20:50.317994 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.317973 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:20:50.320747 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.320731 2577 scope.go:117] "RemoveContainer" containerID="f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3" Apr 17 11:20:50.325090 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.325067 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:20:50.329361 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.328207 2577 scope.go:117] "RemoveContainer" containerID="875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e" Apr 17 11:20:50.335527 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.335508 2577 scope.go:117] "RemoveContainer" containerID="27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94" Apr 17 11:20:50.341340 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.341321 2577 scope.go:117] "RemoveContainer" containerID="d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa" Apr 17 11:20:50.341593 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:20:50.341574 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa\": container with ID starting with 
d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa not found: ID does not exist" containerID="d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa" Apr 17 11:20:50.341656 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.341605 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa"} err="failed to get container status \"d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa\": rpc error: code = NotFound desc = could not find container \"d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa\": container with ID starting with d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa not found: ID does not exist" Apr 17 11:20:50.341656 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.341624 2577 scope.go:117] "RemoveContainer" containerID="27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d" Apr 17 11:20:50.341831 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:20:50.341818 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d\": container with ID starting with 27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d not found: ID does not exist" containerID="27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d" Apr 17 11:20:50.341872 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.341834 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d"} err="failed to get container status \"27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d\": rpc error: code = NotFound desc = could not find container \"27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d\": container with ID starting with 
27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d not found: ID does not exist" Apr 17 11:20:50.341872 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.341848 2577 scope.go:117] "RemoveContainer" containerID="3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247" Apr 17 11:20:50.342049 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:20:50.342035 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247\": container with ID starting with 3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247 not found: ID does not exist" containerID="3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247" Apr 17 11:20:50.342091 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.342052 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247"} err="failed to get container status \"3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247\": rpc error: code = NotFound desc = could not find container \"3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247\": container with ID starting with 3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247 not found: ID does not exist" Apr 17 11:20:50.342091 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.342064 2577 scope.go:117] "RemoveContainer" containerID="4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602" Apr 17 11:20:50.342393 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:20:50.342375 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602\": container with ID starting with 4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602 not found: ID does not exist" 
containerID="4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602" Apr 17 11:20:50.342452 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.342398 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602"} err="failed to get container status \"4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602\": rpc error: code = NotFound desc = could not find container \"4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602\": container with ID starting with 4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602 not found: ID does not exist" Apr 17 11:20:50.342452 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.342413 2577 scope.go:117] "RemoveContainer" containerID="f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3" Apr 17 11:20:50.342611 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:20:50.342596 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3\": container with ID starting with f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3 not found: ID does not exist" containerID="f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3" Apr 17 11:20:50.342646 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.342615 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3"} err="failed to get container status \"f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3\": rpc error: code = NotFound desc = could not find container \"f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3\": container with ID starting with f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3 not found: ID does not exist" Apr 17 
11:20:50.342646 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.342629 2577 scope.go:117] "RemoveContainer" containerID="875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e"
Apr 17 11:20:50.342842 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:20:50.342822 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e\": container with ID starting with 875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e not found: ID does not exist" containerID="875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e"
Apr 17 11:20:50.342896 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.342851 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e"} err="failed to get container status \"875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e\": rpc error: code = NotFound desc = could not find container \"875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e\": container with ID starting with 875457c90e0a4b25e21e76f1ce7047b30f659fa9564c79784aa4bd1c7dfc6a7e not found: ID does not exist"
Apr 17 11:20:50.342896 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.342873 2577 scope.go:117] "RemoveContainer" containerID="27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94"
Apr 17 11:20:50.343455 ip-10-0-142-247 kubenswrapper[2577]: E0417 11:20:50.343430 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94\": container with ID starting with 27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94 not found: ID does not exist" containerID="27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94"
Apr 17 11:20:50.343517 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.343465 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94"} err="failed to get container status \"27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94\": rpc error: code = NotFound desc = could not find container \"27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94\": container with ID starting with 27013a528654fb2f81377f07a07d0f1a0326b4db65871e2a273178486882cd94 not found: ID does not exist"
Apr 17 11:20:50.343517 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.343485 2577 scope.go:117] "RemoveContainer" containerID="d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa"
Apr 17 11:20:50.343724 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.343698 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa"} err="failed to get container status \"d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa\": rpc error: code = NotFound desc = could not find container \"d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa\": container with ID starting with d239b085ebb26c5c3d06faaa3af3188c36fc2feb780daf44a4590862b95605fa not found: ID does not exist"
Apr 17 11:20:50.343883 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.343726 2577 scope.go:117] "RemoveContainer" containerID="27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d"
Apr 17 11:20:50.344002 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.343982 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d"} err="failed to get container status \"27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d\": rpc error: code = NotFound desc = could not find container \"27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d\": container with ID starting with 27f223b3690503c92efeec464010465d986bedf9efac2cf0b458bf9f1343236d not found: ID does not exist"
Apr 17 11:20:50.344047 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.344003 2577 scope.go:117] "RemoveContainer" containerID="3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247"
Apr 17 11:20:50.344217 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.344196 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247"} err="failed to get container status \"3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247\": rpc error: code = NotFound desc = could not find container \"3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247\": container with ID starting with 3d4a6af9de11497732d3b957724d5fade261d59397e84f2ad504767924623247 not found: ID does not exist"
Apr 17 11:20:50.344263 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.344217 2577 scope.go:117] "RemoveContainer" containerID="4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602"
Apr 17 11:20:50.344394 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.344376 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602"} err="failed to get container status \"4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602\": rpc error: code = NotFound desc = could not find container \"4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602\": container with ID starting with 4dc4571c7bc342aedfb9a8aa0cefd443b45fc557461efd83e4a65ba636408602 not found: ID does not exist"
Apr 17 11:20:50.344449 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.344394 2577 scope.go:117] "RemoveContainer" containerID="f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3"
Apr 17 11:20:50.344617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.344601 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3"} err="failed to get container status \"f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3\": rpc error: code = NotFound desc = could not find container \"f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3\": container with ID starting with f9878b8f686499e99e0c4c88ff4257715b86abeb98adb09ee70852b8cb54aae3 not found: ID does not exist"
Apr 17 11:20:50.356412 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356392 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:20:50.356641 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356629 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy-web"
Apr 17 11:20:50.356677 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356643 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy-web"
Apr 17 11:20:50.356677
ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356651 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy-thanos"
Apr 17 11:20:50.356677 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356658 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy-thanos"
Apr 17 11:20:50.356677 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356669 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="prometheus"
Apr 17 11:20:50.356677 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356677 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="prometheus"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356683 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="config-reloader"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356688 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="config-reloader"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356699 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="init-config-reloader"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356704 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="init-config-reloader"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356735 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356743 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356754 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="thanos-sidecar"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356761 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="thanos-sidecar"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356801 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="config-reloader"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356808 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy-thanos"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356816 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="prometheus"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356821 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy-web"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356830 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" containerName="kube-rbac-proxy"
Apr 17 11:20:50.356845 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.356838 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05"
containerName="thanos-sidecar"
Apr 17 11:20:50.360969 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.360955 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:50.362954 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.362937 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 11:20:50.363031 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363018 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-f2pjp4j84f30v\""
Apr 17 11:20:50.363203 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363185 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 11:20:50.363203 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363196 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 11:20:50.363483 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363468 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 11:20:50.363555 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363541 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 11:20:50.363628 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363555 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 11:20:50.363628 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363587 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 11:20:50.363733 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363588 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 11:20:50.363733 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363649 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 11:20:50.363733 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363720 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 11:20:50.363852 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363734 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-49ltg\""
Apr 17 11:20:50.363964 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.363949 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 11:20:50.366915 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.366897 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 11:20:50.369450 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.369432 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 11:20:50.373043 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.373024 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:20:50.412458 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:50.412458 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412461 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qblg2\" (UniqueName: \"kubernetes.io/projected/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-kube-api-access-qblg2\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:50.412617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-web-config\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:50.412617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:50.412617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:50.412617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412559 2577
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412617 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-configmap-kubelet-serving-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-config\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.412792 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.413021 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.413021 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.412874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-config-out\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.513844 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.513815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
11:20:50.513966 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.513853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.513966 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.513877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.513966 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.513898 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-config-out\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.513966 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.513923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.513966 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.513939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qblg2\" (UniqueName: \"kubernetes.io/projected/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-kube-api-access-qblg2\") pod \"prometheus-k8s-0\" (UID: 
\"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.513966 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.513956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-web-config\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.513971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.513996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-config\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514214 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514273 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514768 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.514885 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.514765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.516083 ip-10-0-142-247 kubenswrapper[2577]: 
I0417 11:20:50.515465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.516083 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.515712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.516925 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.516899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.517714 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.517209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.517714 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.517369 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.517714 
ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.517514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.517714 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.517676 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.518222 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.518178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-config\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.518609 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.518583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.518836 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.518819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.518911 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.518884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-web-config\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.519203 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.519173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.519276 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.519216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.519328 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.519313 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.519973 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.519957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.528396 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.528376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qblg2\" (UniqueName: \"kubernetes.io/projected/e64253e2-f0f6-4f86-8d5b-7e357e8a0951-kube-api-access-qblg2\") pod \"prometheus-k8s-0\" (UID: \"e64253e2-f0f6-4f86-8d5b-7e357e8a0951\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.671369 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.671345 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:50.796826 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:50.796787 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:20:50.800392 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:20:50.800365 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode64253e2_f0f6_4f86_8d5b_7e357e8a0951.slice/crio-47e4d9d88bdf32c7b8a17858efe3d86e0e7302797e7d9da2c77309b2d8c1033f WatchSource:0}: Error finding container 47e4d9d88bdf32c7b8a17858efe3d86e0e7302797e7d9da2c77309b2d8c1033f: Status 404 returned error can't find the container with id 47e4d9d88bdf32c7b8a17858efe3d86e0e7302797e7d9da2c77309b2d8c1033f Apr 17 11:20:51.298719 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:51.298690 2577 generic.go:358] "Generic (PLEG): container finished" podID="e64253e2-f0f6-4f86-8d5b-7e357e8a0951" containerID="d95b76153522fbeb2442f7d29abea9148f57be5a31e79b7bc740148982e7ac9d" exitCode=0 Apr 17 11:20:51.298875 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:51.298785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e64253e2-f0f6-4f86-8d5b-7e357e8a0951","Type":"ContainerDied","Data":"d95b76153522fbeb2442f7d29abea9148f57be5a31e79b7bc740148982e7ac9d"} Apr 17 11:20:51.298875 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:51.298831 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e64253e2-f0f6-4f86-8d5b-7e357e8a0951","Type":"ContainerStarted","Data":"47e4d9d88bdf32c7b8a17858efe3d86e0e7302797e7d9da2c77309b2d8c1033f"} Apr 17 11:20:51.539564 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:51.539532 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05" path="/var/lib/kubelet/pods/0e1bf0c1-1bd8-42e2-b11e-c43388d9ae05/volumes" Apr 17 11:20:52.305939 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:52.305906 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e64253e2-f0f6-4f86-8d5b-7e357e8a0951","Type":"ContainerStarted","Data":"40d9e3071c9c9ca6f1006bef53ca2eb4887e1713313dd957aa3875442e924122"} Apr 17 11:20:52.305939 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:52.305940 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e64253e2-f0f6-4f86-8d5b-7e357e8a0951","Type":"ContainerStarted","Data":"a3bbd4edb3f8e7a8fb0bc41a61e25fcd6696449f6a937db19a33be63b10d3f64"} Apr 17 11:20:52.305939 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:52.305950 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e64253e2-f0f6-4f86-8d5b-7e357e8a0951","Type":"ContainerStarted","Data":"19e1ff783bf9ed5d121c8bbb2846ba4adaa0b71bb1476569d4d6df11c0483fdc"} Apr 17 11:20:52.306374 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:52.305959 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e64253e2-f0f6-4f86-8d5b-7e357e8a0951","Type":"ContainerStarted","Data":"61da8410da8936901d8abf081f0bc19e383e9ecedcc221f7a4530ef268ef8ba9"} Apr 17 11:20:52.306374 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:52.305967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e64253e2-f0f6-4f86-8d5b-7e357e8a0951","Type":"ContainerStarted","Data":"335ab44bf252463f2184e3eac35c5615db8611f44bbc3571e41d9cf3aee02642"} Apr 17 11:20:52.306374 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:52.305975 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e64253e2-f0f6-4f86-8d5b-7e357e8a0951","Type":"ContainerStarted","Data":"a91362c172980272f32b2097dbafded1cc0819034a955c84c9e8fcfc7aa06370"} Apr 17 11:20:52.334802 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:52.334755 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.334739872 podStartE2EDuration="2.334739872s" podCreationTimestamp="2026-04-17 11:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:20:52.334023489 +0000 UTC m=+259.382193378" watchObservedRunningTime="2026-04-17 11:20:52.334739872 +0000 UTC m=+259.382909828" Apr 17 11:20:55.671822 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:20:55.671789 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:21:33.416422 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:21:33.416392 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log" Apr 17 11:21:33.416952 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:21:33.416928 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log" Apr 17 11:21:33.422568 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:21:33.422549 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 11:21:50.671828 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:21:50.671794 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:21:50.686837 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:21:50.686813 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:21:51.470126 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:21:51.470073 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:22:45.309642 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.309564 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xhkcm"] Apr 17 11:22:45.312235 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.312218 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.314385 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.314364 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:22:45.326371 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.326343 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xhkcm"] Apr 17 11:22:45.428624 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.428590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/294afbe2-7ccd-45e2-bc90-9e52ba9168f9-original-pull-secret\") pod \"global-pull-secret-syncer-xhkcm\" (UID: \"294afbe2-7ccd-45e2-bc90-9e52ba9168f9\") " pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.428624 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.428626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/294afbe2-7ccd-45e2-bc90-9e52ba9168f9-kubelet-config\") pod \"global-pull-secret-syncer-xhkcm\" (UID: \"294afbe2-7ccd-45e2-bc90-9e52ba9168f9\") " pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.428804 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.428641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/294afbe2-7ccd-45e2-bc90-9e52ba9168f9-dbus\") pod \"global-pull-secret-syncer-xhkcm\" (UID: \"294afbe2-7ccd-45e2-bc90-9e52ba9168f9\") " pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.529557 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.529530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/294afbe2-7ccd-45e2-bc90-9e52ba9168f9-original-pull-secret\") pod \"global-pull-secret-syncer-xhkcm\" (UID: \"294afbe2-7ccd-45e2-bc90-9e52ba9168f9\") " pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.529557 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.529561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/294afbe2-7ccd-45e2-bc90-9e52ba9168f9-kubelet-config\") pod \"global-pull-secret-syncer-xhkcm\" (UID: \"294afbe2-7ccd-45e2-bc90-9e52ba9168f9\") " pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.529732 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.529577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/294afbe2-7ccd-45e2-bc90-9e52ba9168f9-dbus\") pod \"global-pull-secret-syncer-xhkcm\" (UID: \"294afbe2-7ccd-45e2-bc90-9e52ba9168f9\") " pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.529732 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.529650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/294afbe2-7ccd-45e2-bc90-9e52ba9168f9-kubelet-config\") pod \"global-pull-secret-syncer-xhkcm\" (UID: \"294afbe2-7ccd-45e2-bc90-9e52ba9168f9\") " pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.529830 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.529749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/294afbe2-7ccd-45e2-bc90-9e52ba9168f9-dbus\") pod \"global-pull-secret-syncer-xhkcm\" (UID: \"294afbe2-7ccd-45e2-bc90-9e52ba9168f9\") " pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.531902 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.531871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/294afbe2-7ccd-45e2-bc90-9e52ba9168f9-original-pull-secret\") pod \"global-pull-secret-syncer-xhkcm\" (UID: \"294afbe2-7ccd-45e2-bc90-9e52ba9168f9\") " pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.626301 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.626275 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xhkcm" Apr 17 11:22:45.740450 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.740422 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xhkcm"] Apr 17 11:22:45.743540 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:22:45.743509 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod294afbe2_7ccd_45e2_bc90_9e52ba9168f9.slice/crio-e277387c01d63cd334e379301426deba5156df72147652cc622fb569d9f50318 WatchSource:0}: Error finding container e277387c01d63cd334e379301426deba5156df72147652cc622fb569d9f50318: Status 404 returned error can't find the container with id e277387c01d63cd334e379301426deba5156df72147652cc622fb569d9f50318 Apr 17 11:22:45.745205 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:45.745190 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:22:46.591743 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:46.591693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xhkcm" event={"ID":"294afbe2-7ccd-45e2-bc90-9e52ba9168f9","Type":"ContainerStarted","Data":"e277387c01d63cd334e379301426deba5156df72147652cc622fb569d9f50318"} Apr 17 11:22:49.601779 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:49.601748 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xhkcm" 
event={"ID":"294afbe2-7ccd-45e2-bc90-9e52ba9168f9","Type":"ContainerStarted","Data":"1e494e8fab1fc90e918f6ba9c76199339c1c79f3d7989f7f4ba6347d82def5b6"} Apr 17 11:22:49.633549 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:22:49.633505 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xhkcm" podStartSLOduration=1.071772452 podStartE2EDuration="4.633489229s" podCreationTimestamp="2026-04-17 11:22:45 +0000 UTC" firstStartedPulling="2026-04-17 11:22:45.745311814 +0000 UTC m=+372.793481680" lastFinishedPulling="2026-04-17 11:22:49.307028576 +0000 UTC m=+376.355198457" observedRunningTime="2026-04-17 11:22:49.633167169 +0000 UTC m=+376.681337054" watchObservedRunningTime="2026-04-17 11:22:49.633489229 +0000 UTC m=+376.681659117" Apr 17 11:26:33.433450 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:33.433421 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log" Apr 17 11:26:33.435083 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:33.435061 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log" Apr 17 11:26:48.859653 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:48.859611 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xhkcm_294afbe2-7ccd-45e2-bc90-9e52ba9168f9/global-pull-secret-syncer/0.log" Apr 17 11:26:48.908003 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:48.907968 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-46xz6_1b74d479-a57e-4b33-8dc6-cd4321d01595/konnectivity-agent/0.log" Apr 17 11:26:49.053912 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:49.053880 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-247.ec2.internal_c775281676decdc6d9802c88c4684de2/haproxy/0.log" Apr 17 11:26:52.615160 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:52.615133 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cgcx6_ab3076be-7073-467c-8065-01e1192ff5f6/node-exporter/0.log" Apr 17 11:26:52.637354 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:52.637327 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cgcx6_ab3076be-7073-467c-8065-01e1192ff5f6/kube-rbac-proxy/0.log" Apr 17 11:26:52.662037 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:52.662016 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cgcx6_ab3076be-7073-467c-8065-01e1192ff5f6/init-textfile/0.log" Apr 17 11:26:52.951818 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:52.951744 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e64253e2-f0f6-4f86-8d5b-7e357e8a0951/prometheus/0.log" Apr 17 11:26:52.991310 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:52.991286 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e64253e2-f0f6-4f86-8d5b-7e357e8a0951/config-reloader/0.log" Apr 17 11:26:53.019475 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:53.019455 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e64253e2-f0f6-4f86-8d5b-7e357e8a0951/thanos-sidecar/0.log" Apr 17 11:26:53.044953 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:53.044935 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e64253e2-f0f6-4f86-8d5b-7e357e8a0951/kube-rbac-proxy-web/0.log" Apr 17 11:26:53.070962 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:53.070940 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e64253e2-f0f6-4f86-8d5b-7e357e8a0951/kube-rbac-proxy/0.log" Apr 17 11:26:53.096652 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:53.096630 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e64253e2-f0f6-4f86-8d5b-7e357e8a0951/kube-rbac-proxy-thanos/0.log" Apr 17 11:26:53.130801 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:53.130781 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e64253e2-f0f6-4f86-8d5b-7e357e8a0951/init-config-reloader/0.log" Apr 17 11:26:56.096752 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.096723 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv"] Apr 17 11:26:56.099626 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.099612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.101623 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.101601 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vp6ck\"/\"default-dockercfg-hjq8z\"" Apr 17 11:26:56.101784 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.101768 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vp6ck\"/\"openshift-service-ca.crt\"" Apr 17 11:26:56.104370 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.104353 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vp6ck\"/\"kube-root-ca.crt\"" Apr 17 11:26:56.124512 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.124491 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv"] Apr 17 11:26:56.220469 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.220436 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-lib-modules\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.220469 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.220476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-podres\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.220704 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.220494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-sys\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.220704 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.220580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-proc\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.220704 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.220605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdcbt\" (UniqueName: \"kubernetes.io/projected/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-kube-api-access-pdcbt\") pod 
\"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.321348 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.321311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdcbt\" (UniqueName: \"kubernetes.io/projected/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-kube-api-access-pdcbt\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.321540 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.321381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-lib-modules\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.321540 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.321413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-podres\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.321540 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.321440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-sys\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.321540 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.321481 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-proc\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.321712 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.321564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-proc\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.321712 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.321567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-podres\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.321712 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.321561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-lib-modules\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.321712 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.321591 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-sys\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.331904 ip-10-0-142-247 
kubenswrapper[2577]: I0417 11:26:56.331881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdcbt\" (UniqueName: \"kubernetes.io/projected/6b4e26f5-9e5c-45a0-a96c-f98c353b5a67-kube-api-access-pdcbt\") pod \"perf-node-gather-daemonset-bwnbv\" (UID: \"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.409062 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.408985 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:56.535680 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:56.535627 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv"] Apr 17 11:26:56.538993 ip-10-0-142-247 kubenswrapper[2577]: W0417 11:26:56.538956 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6b4e26f5_9e5c_45a0_a96c_f98c353b5a67.slice/crio-aff0c884d3170ae1bf96a872a27f98e9c32a0822b04ce30b0282210d7d5d5212 WatchSource:0}: Error finding container aff0c884d3170ae1bf96a872a27f98e9c32a0822b04ce30b0282210d7d5d5212: Status 404 returned error can't find the container with id aff0c884d3170ae1bf96a872a27f98e9c32a0822b04ce30b0282210d7d5d5212 Apr 17 11:26:57.047048 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:57.047021 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2n6g2_997fc9f1-c739-43ee-8f0f-bcc328a8b37f/dns/0.log" Apr 17 11:26:57.073973 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:57.073945 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2n6g2_997fc9f1-c739-43ee-8f0f-bcc328a8b37f/kube-rbac-proxy/0.log" Apr 17 11:26:57.233085 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:57.233051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" event={"ID":"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67","Type":"ContainerStarted","Data":"ca99415aaaeeda92f820789b39e909d843581a19a1e0de576632d626f8a0bfbe"} Apr 17 11:26:57.233085 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:57.233083 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" event={"ID":"6b4e26f5-9e5c-45a0-a96c-f98c353b5a67","Type":"ContainerStarted","Data":"aff0c884d3170ae1bf96a872a27f98e9c32a0822b04ce30b0282210d7d5d5212"} Apr 17 11:26:57.233590 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:57.233222 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:26:57.450470 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:57.450445 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hq6cq_cd45e2c2-be74-4898-855b-e51a00ea7a92/dns-node-resolver/0.log" Apr 17 11:26:57.908031 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:57.907996 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-869d7b8bf7-cxst8_58952672-09f4-40b6-ac40-4902268e9ea4/registry/0.log" Apr 17 11:26:57.941749 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:57.941716 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mbc4v_1ad8d75d-d9b5-45d3-ada0-68b4d648c30f/node-ca/0.log" Apr 17 11:26:59.344174 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:59.344141 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-nd4tb_dd572e4e-2d5a-47ec-8142-2e264dae6c8b/serve-healthcheck-canary/0.log" Apr 17 11:26:59.915309 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:59.915286 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-9mvrz_a4d42c8e-cb41-4cbc-847e-55eedee9b9c1/kube-rbac-proxy/0.log" Apr 17 11:26:59.941771 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:59.941750 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9mvrz_a4d42c8e-cb41-4cbc-847e-55eedee9b9c1/exporter/0.log" Apr 17 11:26:59.970549 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:26:59.970523 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9mvrz_a4d42c8e-cb41-4cbc-847e-55eedee9b9c1/extractor/0.log" Apr 17 11:27:03.244668 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:03.244635 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" Apr 17 11:27:03.262907 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:03.262857 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-bwnbv" podStartSLOduration=7.262843565 podStartE2EDuration="7.262843565s" podCreationTimestamp="2026-04-17 11:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:26:57.276645857 +0000 UTC m=+624.324815746" watchObservedRunningTime="2026-04-17 11:27:03.262843565 +0000 UTC m=+630.311013452" Apr 17 11:27:08.274722 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:08.274696 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2d4x_e6311d15-5f59-4e6c-8732-269f06b40c16/kube-multus-additional-cni-plugins/0.log" Apr 17 11:27:08.299224 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:08.299199 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2d4x_e6311d15-5f59-4e6c-8732-269f06b40c16/egress-router-binary-copy/0.log" Apr 17 11:27:08.323971 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:08.323952 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2d4x_e6311d15-5f59-4e6c-8732-269f06b40c16/cni-plugins/0.log" Apr 17 11:27:08.349768 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:08.349750 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2d4x_e6311d15-5f59-4e6c-8732-269f06b40c16/bond-cni-plugin/0.log" Apr 17 11:27:08.374889 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:08.374868 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2d4x_e6311d15-5f59-4e6c-8732-269f06b40c16/routeoverride-cni/0.log" Apr 17 11:27:08.399577 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:08.399550 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2d4x_e6311d15-5f59-4e6c-8732-269f06b40c16/whereabouts-cni-bincopy/0.log" Apr 17 11:27:08.423677 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:08.423659 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2d4x_e6311d15-5f59-4e6c-8732-269f06b40c16/whereabouts-cni/0.log" Apr 17 11:27:08.456337 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:08.456269 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xxs_002766c9-b94d-4afa-a980-2f7abc5b32d2/kube-multus/0.log" Apr 17 11:27:08.596639 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:08.596612 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tl874_0caad504-c16e-477e-b9a9-80928417640e/network-metrics-daemon/0.log" Apr 17 11:27:08.620995 ip-10-0-142-247 kubenswrapper[2577]: 
I0417 11:27:08.620968 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tl874_0caad504-c16e-477e-b9a9-80928417640e/kube-rbac-proxy/0.log" Apr 17 11:27:09.435059 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:09.435027 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-controller/0.log" Apr 17 11:27:09.455072 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:09.455048 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/0.log" Apr 17 11:27:09.460780 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:09.460760 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovn-acl-logging/1.log" Apr 17 11:27:09.496876 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:09.496850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/kube-rbac-proxy-node/0.log" Apr 17 11:27:09.521257 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:09.521240 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:27:09.542834 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:09.542818 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/northd/0.log" Apr 17 11:27:09.565345 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:09.565329 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/nbdb/0.log" Apr 17 11:27:09.588909 ip-10-0-142-247 kubenswrapper[2577]: 
I0417 11:27:09.588895 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/sbdb/0.log" Apr 17 11:27:09.755865 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:09.755838 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65xlv_e5e299b9-9fdc-4122-ab7a-5d4a2753c88e/ovnkube-controller/0.log" Apr 17 11:27:11.708638 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:11.708612 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9975j_08f32961-6393-4bcc-a8bf-c27e9df01e0e/network-check-target-container/0.log" Apr 17 11:27:12.733356 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:12.733301 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qdhpl_207babed-420b-4305-9046-6bc8fb348f3f/iptables-alerter/0.log" Apr 17 11:27:13.511331 ip-10-0-142-247 kubenswrapper[2577]: I0417 11:27:13.511304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-w6wn4_2c85040c-9a42-47fe-bdd4-7a0d5418502a/tuned/0.log"